42613 1727204570.31810: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-MVC
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
42613 1727204570.33097: Added group all to inventory
42613 1727204570.33100: Added group ungrouped to inventory
42613 1727204570.33104: Group all now contains ungrouped
42613 1727204570.33108: Examining possible inventory source: /tmp/network-jrl/inventory-0Xx.yml
42613 1727204570.79338: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
42613 1727204570.79612: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
42613 1727204570.79640: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
42613 1727204570.79709: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
42613 1727204570.79993: Loaded config def from plugin (inventory/script)
42613 1727204570.79995: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
42613 1727204570.80042: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
42613 1727204570.80143: Loaded config def from plugin (inventory/yaml)
42613 1727204570.80145: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
42613 1727204570.80450: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
42613 1727204570.81355: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
42613 1727204570.81360: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
42613 1727204570.81364: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
42613 1727204570.81577: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
42613 1727204570.81583: Loading data from /tmp/network-jrl/inventory-0Xx.yml
42613 1727204570.81663: /tmp/network-jrl/inventory-0Xx.yml was not parsable by auto
42613 1727204570.81735: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
42613 1727204570.81989: Loading data from /tmp/network-jrl/inventory-0Xx.yml
42613 1727204570.82092: group all already in inventory
42613 1727204570.82100: set inventory_file for managed-node1
42613 1727204570.82105: set inventory_dir for managed-node1
42613 1727204570.82106: Added host managed-node1 to inventory
42613 1727204570.82108: Added host managed-node1 to group all
42613 1727204570.82109: set ansible_host for managed-node1
42613 1727204570.82110: set ansible_ssh_extra_args for managed-node1
42613 1727204570.82113: set inventory_file for managed-node2
42613 1727204570.82116: set inventory_dir for managed-node2
42613 1727204570.82117: Added host managed-node2 to inventory
42613 1727204570.82119: Added host managed-node2 to group all
42613 1727204570.82119: set ansible_host for managed-node2
42613 1727204570.82120: set ansible_ssh_extra_args for managed-node2
42613 1727204570.82123: set inventory_file for managed-node3
42613 1727204570.82125: set inventory_dir for managed-node3
42613 1727204570.82126: Added host managed-node3 to inventory
42613 1727204570.82127: Added host managed-node3 to group all
42613 1727204570.82128: set ansible_host for managed-node3
42613 1727204570.82129: set ansible_ssh_extra_args for managed-node3
42613 1727204570.82133: Reconcile groups and hosts in inventory.
42613 1727204570.82137: Group ungrouped now contains managed-node1
42613 1727204570.82139: Group ungrouped now contains managed-node2
42613 1727204570.82140: Group ungrouped now contains managed-node3
42613 1727204570.82442: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name
42613 1727204570.82792: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments
42613 1727204570.82842: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py
42613 1727204570.82875: Loaded config def from plugin (vars/host_group_vars)
42613 1727204570.82878: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True)
42613 1727204570.82887: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars
42613 1727204570.82896: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False)
42613 1727204570.82946: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False)
42613 1727204570.83747: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
42613 1727204570.84067: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py
42613 1727204570.84114: Loaded config def from plugin (connection/local)
42613 1727204570.84117: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True)
42613 1727204570.85719: Loaded config def from plugin (connection/paramiko_ssh)
42613 1727204570.85724: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True)
42613 1727204570.88030: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
42613 1727204570.88081: Loaded config def from plugin (connection/psrp)
42613 1727204570.88085: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True)
42613 1727204570.89988: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
42613 1727204570.90041: Loaded config def from plugin (connection/ssh)
42613 1727204570.90045: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True)
42613 1727204570.95411: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
42613 1727204570.95458: Loaded config def from plugin (connection/winrm)
42613 1727204570.95461: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True)
42613 1727204570.95806: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name
42613 1727204570.95894: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py
42613 1727204570.96171: Loaded config def from plugin (shell/cmd)
42613 1727204570.96174: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True)
42613 1727204570.96206: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False)
42613 1727204570.96284: Loaded config def from plugin (shell/powershell)
42613 1727204570.96286: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True)
42613 1727204570.96346: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py
42613 1727204570.96756: Loaded config def from plugin (shell/sh)
42613 1727204570.96759: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True)
42613 1727204570.97056: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name
42613 1727204570.97206: Loaded config def from plugin (become/runas)
42613 1727204570.97209: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True)
42613 1727204570.97731: Loaded config def from plugin (become/su)
42613 1727204570.97735: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True)
42613 1727204570.98126: Loaded config def from plugin (become/sudo)
42613 1727204570.98129: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True)
running playbook inside collection fedora.linux_system_roles
42613 1727204570.98377: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tests_routing_rules_nm.yml
42613 1727204570.98962: in VariableManager get_vars()
42613 1727204570.99193: done with get_vars()
42613 1727204570.99338: trying /usr/local/lib/python3.12/site-packages/ansible/modules
42613 1727204571.06115: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action
42613 1727204571.06446: in VariableManager get_vars()
42613 1727204571.06452: done with get_vars()
42613 1727204571.06455: variable 'playbook_dir' from source: magic vars
42613 1727204571.06455: variable 'ansible_playbook_python' from source: magic vars
42613 1727204571.06456: variable 'ansible_config_file' from source: magic vars
42613 1727204571.06457: variable 'groups' from source: magic vars
42613 1727204571.06458: variable 'omit' from source: magic vars
42613 1727204571.06458: variable 'ansible_version' from source: magic vars
42613 1727204571.06459: variable 'ansible_check_mode' from source: magic vars
42613 1727204571.06460: variable 'ansible_diff_mode' from source: magic vars
42613 1727204571.06460: variable 'ansible_forks' from source: magic vars
42613 1727204571.06461: variable 'ansible_inventory_sources' from source: magic vars
42613 1727204571.06462: variable 'ansible_skip_tags' from source: magic vars
42613 1727204571.06462: variable 'ansible_limit' from source: magic vars
42613 1727204571.06463: variable 'ansible_run_tags' from source: magic vars
42613 1727204571.06464: variable 'ansible_verbosity' from source: magic vars
42613 1727204571.06511: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml
42613 1727204571.09315: in VariableManager get_vars()
42613 1727204571.09338: done with get_vars()
42613 1727204571.09384: in VariableManager get_vars()
42613 1727204571.09399: done with get_vars()
42613 1727204571.09437: in VariableManager get_vars()
42613 1727204571.09451: done with get_vars()
42613 1727204571.09540: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__
42613 1727204571.10087: in VariableManager get_vars()
42613 1727204571.10106: done with get_vars()
42613 1727204571.10113: variable 'omit' from source: magic vars
42613 1727204571.10135: variable 'omit' from source: magic vars
42613 1727204571.10174: in VariableManager get_vars()
42613 1727204571.10187: done with get_vars()
42613 1727204571.10239: in VariableManager get_vars()
42613 1727204571.10254: done with get_vars()
42613 1727204571.10298: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
42613 1727204571.10753: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
42613 1727204571.11106: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
42613 1727204571.12751: in VariableManager get_vars()
42613 1727204571.12983: done with get_vars()
42613 1727204571.13886: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
42613 1727204571.23585: in VariableManager get_vars()
42613 1727204571.23589: done with get_vars()
42613 1727204571.23592: variable 'playbook_dir' from source: magic vars
42613 1727204571.23593: variable 'ansible_playbook_python' from source: magic vars
42613 1727204571.23594: variable 'ansible_config_file' from source: magic vars
42613 1727204571.23595: variable 'groups' from source: magic vars
42613 1727204571.23595: variable 'omit' from source: magic vars
42613 1727204571.23596: variable 'ansible_version' from source: magic vars
42613 1727204571.23597: variable 'ansible_check_mode' from source: magic vars
42613 1727204571.23598: variable 'ansible_diff_mode' from source: magic vars
42613 1727204571.23599: variable 'ansible_forks' from source: magic vars
42613 1727204571.23599: variable 'ansible_inventory_sources' from source: magic vars
42613 1727204571.23600: variable 'ansible_skip_tags' from source: magic vars
42613 1727204571.23601: variable 'ansible_limit' from source: magic vars
42613 1727204571.23602: variable 'ansible_run_tags' from source: magic vars
42613 1727204571.23602: variable 'ansible_verbosity' from source: magic vars
42613 1727204571.23647: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml
42613 1727204571.23733: in VariableManager get_vars()
42613 1727204571.23737: done with get_vars()
42613 1727204571.23740: variable 'playbook_dir' from source: magic vars
42613 1727204571.23741: variable 'ansible_playbook_python' from source: magic vars
42613 1727204571.23741: variable 'ansible_config_file' from source: magic vars
42613 1727204571.23742: variable 'groups' from source: magic vars
42613 1727204571.23743: variable 'omit' from source: magic vars
42613 1727204571.23744: variable 'ansible_version' from source: magic vars
42613 1727204571.23744: variable 'ansible_check_mode' from source: magic vars
42613 1727204571.23745: variable 'ansible_diff_mode' from source: magic vars
42613 1727204571.23746: variable 'ansible_forks' from source: magic vars
42613 1727204571.23747: variable 'ansible_inventory_sources' from source: magic vars
42613 1727204571.23747: variable 'ansible_skip_tags' from source: magic vars
42613 1727204571.23748: variable 'ansible_limit' from source: magic vars
42613 1727204571.23749: variable 'ansible_run_tags' from source: magic vars
42613 1727204571.23750: variable 'ansible_verbosity' from source: magic vars
42613 1727204571.23994: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml
42613 1727204571.24080: in VariableManager get_vars()
42613 1727204571.24096: done with get_vars()
42613 1727204571.24145: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
42613 1727204571.24493: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
42613 1727204571.24779: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
42613 1727204571.25612: in VariableManager get_vars()
42613 1727204571.25639: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
42613 1727204571.29354: in VariableManager get_vars()
42613 1727204571.29505: done with get_vars()
42613 1727204571.29551: in VariableManager get_vars()
42613 1727204571.29554: done with get_vars()
42613 1727204571.29557: variable 'playbook_dir' from source: magic vars
42613 1727204571.29558: variable 'ansible_playbook_python' from source: magic vars
42613 1727204571.29559: variable 'ansible_config_file' from source: magic vars
42613 1727204571.29559: variable 'groups' from source: magic vars
42613 1727204571.29560: variable 'omit' from source: magic vars
42613 1727204571.29561: variable 'ansible_version' from source: magic vars
42613 1727204571.29562: variable 'ansible_check_mode' from source: magic vars
42613 1727204571.29563: variable 'ansible_diff_mode' from source: magic vars
42613 1727204571.29564: variable 'ansible_forks' from source: magic vars
42613 1727204571.29564: variable 'ansible_inventory_sources' from source: magic vars
42613 1727204571.29669: variable 'ansible_skip_tags' from source: magic vars
42613 1727204571.29671: variable 'ansible_limit' from source: magic vars
42613 1727204571.29672: variable 'ansible_run_tags' from source: magic vars
42613 1727204571.29673: variable 'ansible_verbosity' from source: magic vars
42613 1727204571.29714: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml
42613 1727204571.29888: in VariableManager get_vars()
42613 1727204571.29903: done with get_vars()
42613 1727204571.29949: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
42613 1727204571.30289: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
42613 1727204571.30681: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
42613 1727204571.31353: in VariableManager get_vars()
42613 1727204571.31381: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
42613 1727204571.34998: in VariableManager get_vars()
42613 1727204571.35018: done with get_vars()
42613 1727204571.35056: in VariableManager get_vars()
42613 1727204571.35277: done with get_vars()
42613 1727204571.35321: in VariableManager get_vars()
42613 1727204571.35334: done with get_vars()
42613 1727204571.35417: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback
42613 1727204571.35434: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
42613 1727204571.36135: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py
42613 1727204571.36643: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug)
42613 1727204571.36647: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-MVC/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__)
42613 1727204571.36793: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name
42613 1727204571.36826: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False)
42613 1727204571.37133: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py
42613 1727204571.37444: Loaded config def from plugin (callback/default)
42613 1727204571.37448: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
42613 1727204571.38884: Loaded config def from plugin (callback/junit)
42613 1727204571.38888: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
42613 1727204571.38944: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False)
42613 1727204571.39022: Loaded config def from plugin (callback/minimal)
42613 1727204571.39025: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
42613 1727204571.39070: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
42613 1727204571.39135: Loaded config def from plugin (callback/tree)
42613 1727204571.39138: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
42613 1727204571.39271: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks)
42613 1727204571.39274: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-MVC/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_routing_rules_nm.yml *******************************************
6 plays in /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tests_routing_rules_nm.yml
42613 1727204571.39303: in VariableManager get_vars()
42613 1727204571.39320: done with get_vars()
42613 1727204571.39326: in VariableManager get_vars()
42613 1727204571.39335: done with get_vars()
42613 1727204571.39339: variable 'omit' from source: magic vars
42613 1727204571.39383: in VariableManager get_vars()
42613 1727204571.39399: done with get_vars()
42613 1727204571.39422: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_routing_rules.yml' with nm as provider] ****
42613 1727204571.40732: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
42613 1727204571.40923: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
42613 1727204571.41111: getting the remaining hosts for this loop
42613 1727204571.41113: done getting the remaining hosts for this loop
42613 1727204571.41122: getting the next task for host managed-node3
42613 1727204571.41126: done getting next task for host managed-node3
42613 1727204571.41129: ^ task is: TASK: Gathering Facts
42613 1727204571.41131: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
42613 1727204571.41133: getting variables
42613 1727204571.41135: in VariableManager get_vars()
42613 1727204571.41149: Calling all_inventory to load vars for managed-node3
42613 1727204571.41152: Calling groups_inventory to load vars for managed-node3
42613 1727204571.41155: Calling all_plugins_inventory to load vars for managed-node3
42613 1727204571.41199: Calling all_plugins_play to load vars for managed-node3
42613 1727204571.41215: Calling groups_plugins_inventory to load vars for managed-node3
42613 1727204571.41219: Calling groups_plugins_play to load vars for managed-node3
42613 1727204571.41258: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
42613 1727204571.41313: done with get_vars()
42613 1727204571.41321: done getting variables
42613 1727204571.41751: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tests_routing_rules_nm.yml:6
Tuesday 24 September 2024 15:02:51 -0400 (0:00:00.025) 0:00:00.025 *****
42613 1727204571.41783: entering _queue_task() for managed-node3/gather_facts
42613 1727204571.41784: Creating lock for gather_facts
42613 1727204571.42513: worker is 1 (out of 1 available)
42613 1727204571.42525: exiting _queue_task() for managed-node3/gather_facts
42613 1727204571.42539: done queuing things up, now waiting for results queue to drain
42613 1727204571.42541: waiting for pending results...
42613 1727204571.42758: running TaskExecutor() for managed-node3/TASK: Gathering Facts
42613 1727204571.43225: in run() - task 127b8e07-fff9-2f91-05d8-0000000000af
42613 1727204571.43230: variable 'ansible_search_path' from source: unknown
42613 1727204571.43236: calling self._execute()
42613 1727204571.43552: variable 'ansible_host' from source: host vars for 'managed-node3'
42613 1727204571.43556: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
42613 1727204571.43559: variable 'omit' from source: magic vars
42613 1727204571.43658: variable 'omit' from source: magic vars
42613 1727204571.43804: variable 'omit' from source: magic vars
42613 1727204571.43919: variable 'omit' from source: magic vars
42613 1727204571.44038: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
42613 1727204571.44202: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
42613 1727204571.44206: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
42613 1727204571.44395: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
42613 1727204571.44398: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
42613 1727204571.44401: variable 'inventory_hostname' from source: host vars for 'managed-node3'
42613 1727204571.44404: variable 'ansible_host' from source: host vars for 'managed-node3'
42613 1727204571.44407: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
42613 1727204571.44630: Set connection var ansible_shell_executable to /bin/sh
42613 1727204571.44854: Set connection var ansible_pipelining to False
42613 1727204571.44858: Set connection var ansible_module_compression to ZIP_DEFLATED
42613 1727204571.44861: Set connection var ansible_connection to ssh
42613 1727204571.44863: Set connection var ansible_timeout to 10
42613 1727204571.44868: Set connection var ansible_shell_type to sh
42613 1727204571.44870: variable 'ansible_shell_executable' from source: unknown
42613 1727204571.44873: variable 'ansible_connection' from source: unknown
42613 1727204571.44875: variable 'ansible_module_compression' from source: unknown
42613 1727204571.44877: variable 'ansible_shell_type' from source: unknown
42613 1727204571.44880: variable 'ansible_shell_executable' from source: unknown
42613 1727204571.44883: variable 'ansible_host' from source: host vars for 'managed-node3'
42613 1727204571.44886: variable 'ansible_pipelining' from source: unknown
42613 1727204571.44889: variable 'ansible_timeout' from source: unknown
42613 1727204571.44892: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
42613 1727204571.45408: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
42613 1727204571.45443: variable 'omit' from source: magic vars
42613 1727204571.45446: starting attempt loop
42613 1727204571.45451: running the handler
42613 1727204571.45551: variable 'ansible_facts' from source: unknown
42613 1727204571.45556: _low_level_execute_command(): starting
42613 1727204571.45559: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
42613 1727204571.47305: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
42613 1727204571.47478: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<<
42613 1727204571.47628: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
42613 1727204571.47744: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
42613 1727204571.49609: stdout chunk (state=3): >>>/root <<<
42613 1727204571.50126: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
42613 1727204571.50132: stdout chunk (state=3): >>><<<
42613 1727204571.50136: stderr chunk (state=3): >>><<<
42613 1727204571.50140: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
42613 1727204571.50144: _low_level_execute_command(): starting
42613 1727204571.50234: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204571.5003932-42690-154033137034559 `" && echo ansible-tmp-1727204571.5003932-42690-154033137034559="` echo /root/.ansible/tmp/ansible-tmp-1727204571.5003932-42690-154033137034559 `" ) && sleep 0'
42613 1727204571.51944: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
42613 1727204571.52056: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
42613 1727204571.52169: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
42613 1727204571.52319: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<<
42613 1727204571.52431: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
42613 1727204571.52631: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
42613 1727204571.54870: stdout chunk (state=3): >>>ansible-tmp-1727204571.5003932-42690-154033137034559=/root/.ansible/tmp/ansible-tmp-1727204571.5003932-42690-154033137034559 <<<
42613 1727204571.55211: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
42613 1727204571.55216: stderr chunk (state=3): >>><<<
42613 1727204571.55218: stdout chunk (state=3): >>><<<
42613 1727204571.55220: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204571.5003932-42690-154033137034559=/root/.ansible/tmp/ansible-tmp-1727204571.5003932-42690-154033137034559 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at
'/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204571.55225: variable 'ansible_module_compression' from source: unknown 42613 1727204571.55293: ANSIBALLZ: Using generic lock for ansible.legacy.setup 42613 1727204571.55378: ANSIBALLZ: Acquiring lock 42613 1727204571.55387: ANSIBALLZ: Lock acquired: 139982757271872 42613 1727204571.55395: ANSIBALLZ: Creating module 42613 1727204572.30875: ANSIBALLZ: Writing module into payload 42613 1727204572.31039: ANSIBALLZ: Writing module 42613 1727204572.31146: ANSIBALLZ: Renaming module 42613 1727204572.31209: ANSIBALLZ: Done creating module 42613 1727204572.31256: variable 'ansible_facts' from source: unknown 42613 1727204572.31318: variable 'inventory_hostname' from source: host vars for 'managed-node3' 42613 1727204572.31390: _low_level_execute_command(): starting 42613 1727204572.31401: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 42613 1727204572.33150: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<<
42613 1727204572.33230: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
42613 1727204572.33415: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
42613 1727204572.33419: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
42613 1727204572.35182: stdout chunk (state=3): >>>PLATFORM <<<
42613 1727204572.35462: stdout chunk (state=3): >>>Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND <<<
42613 1727204572.35470: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
42613 1727204572.35689: stderr chunk (state=3): >>><<<
42613 1727204572.35809: stdout chunk (state=3): >>><<<
42613 1727204572.35813: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
42613 1727204572.35819 [managed-node3]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3']
42613 1727204572.36077: _low_level_execute_command(): starting
42613 1727204572.36080: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0'
42613 1727204572.36294: Sending initial data
42613 1727204572.36303: Sent initial data (1181 bytes)
42613 1727204572.37705: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
42613 1727204572.37804: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<<
42613 1727204572.37835: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
42613 1727204572.37935: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
42613 1727204572.41973: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"Fedora Linux\"\nVERSION=\"40 (Forty)\"\nID=fedora\nVERSION_ID=40\nVERSION_CODENAME=\"\"\nPLATFORM_ID=\"platform:f40\"\nPRETTY_NAME=\"Fedora Linux 40 (Forty)\"\nANSI_COLOR=\"0;38;2;60;110;180\"\nLOGO=fedora-logo-icon\nCPE_NAME=\"cpe:/o:fedoraproject:fedora:40\"\nDEFAULT_HOSTNAME=\"fedora\"\nHOME_URL=\"https://fedoraproject.org/\"\nDOCUMENTATION_URL=\"https://docs.fedoraproject.org/en-US/fedora/f40/system-administrators-guide/\"\nSUPPORT_URL=\"https://ask.fedoraproject.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_BUGZILLA_PRODUCT=\"Fedora\"\nREDHAT_BUGZILLA_PRODUCT_VERSION=40\nREDHAT_SUPPORT_PRODUCT=\"Fedora\"\nREDHAT_SUPPORT_PRODUCT_VERSION=40\nSUPPORT_END=2025-05-13\n"} <<<
42613 1727204572.42574: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
42613 1727204572.42579: stdout chunk (state=3): >>><<<
42613 1727204572.42581: stderr chunk (state=3): >>><<<
42613 1727204572.42584: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"Fedora Linux\"\nVERSION=\"40 (Forty)\"\nID=fedora\nVERSION_ID=40\nVERSION_CODENAME=\"\"\nPLATFORM_ID=\"platform:f40\"\nPRETTY_NAME=\"Fedora Linux 40 (Forty)\"\nANSI_COLOR=\"0;38;2;60;110;180\"\nLOGO=fedora-logo-icon\nCPE_NAME=\"cpe:/o:fedoraproject:fedora:40\"\nDEFAULT_HOSTNAME=\"fedora\"\nHOME_URL=\"https://fedoraproject.org/\"\nDOCUMENTATION_URL=\"https://docs.fedoraproject.org/en-US/fedora/f40/system-administrators-guide/\"\nSUPPORT_URL=\"https://ask.fedoraproject.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_BUGZILLA_PRODUCT=\"Fedora\"\nREDHAT_BUGZILLA_PRODUCT_VERSION=40\nREDHAT_SUPPORT_PRODUCT=\"Fedora\"\nREDHAT_SUPPORT_PRODUCT_VERSION=40\nSUPPORT_END=2025-05-13\n"} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
42613 1727204572.42784: variable 'ansible_facts' from source: unknown
42613 1727204572.42891: variable 'ansible_facts' from source: unknown
42613 1727204572.42894: variable 'ansible_module_compression' from source: unknown
42613 1727204572.42898: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-42613igbktgy6/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED
42613 1727204572.43071: variable 'ansible_facts' from source: unknown
42613 1727204572.43470: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204571.5003932-42690-154033137034559/AnsiballZ_setup.py
42613 1727204572.43939: Sending initial data
42613 1727204572.43996: Sent initial data (154 bytes)
42613 1727204572.45214: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
42613 1727204572.45219: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
42613 1727204572.45222: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration <<<
42613 1727204572.45224: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
42613 1727204572.45227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
42613 1727204572.45334: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<<
42613 1727204572.45382: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
42613 1727204572.45476: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
42613 1727204572.47284: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<<
42613 1727204572.47375: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<<
42613 1727204572.47408: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-42613igbktgy6/tmp9yl0q7n_ /root/.ansible/tmp/ansible-tmp-1727204571.5003932-42690-154033137034559/AnsiballZ_setup.py <<<
42613 1727204572.47461: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204571.5003932-42690-154033137034559/AnsiballZ_setup.py" <<<
42613 1727204572.47702: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-42613igbktgy6/tmp9yl0q7n_" to remote "/root/.ansible/tmp/ansible-tmp-1727204571.5003932-42690-154033137034559/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204571.5003932-42690-154033137034559/AnsiballZ_setup.py" <<<
42613 1727204572.50388: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
42613 1727204572.50470: stdout chunk (state=3): >>><<<
42613 1727204572.50475: stderr chunk (state=3): >>><<<
42613 1727204572.50480: done transferring module to remote
42613 1727204572.50482: _low_level_execute_command(): starting
42613 1727204572.50485: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204571.5003932-42690-154033137034559/ /root/.ansible/tmp/ansible-tmp-1727204571.5003932-42690-154033137034559/AnsiballZ_setup.py && sleep 0'
42613 1727204572.51869: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<<
42613 1727204572.51879: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
42613 1727204572.51899: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
42613 1727204572.51912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
42613 1727204572.52011: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
42613 1727204572.52080: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<<
42613 1727204572.52235: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
42613 1727204572.52252: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
42613 1727204572.52352: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
42613 1727204572.54593: stderr chunk (state=3):
>>>debug2: Received exit status from master 0 <<<
42613 1727204572.54989: stderr chunk (state=3): >>><<<
42613 1727204572.54993: stdout chunk (state=3): >>><<<
42613 1727204572.55055: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
42613 1727204572.55059: _low_level_execute_command(): starting
42613 1727204572.55061: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204571.5003932-42690-154033137034559/AnsiballZ_setup.py && sleep 0'
42613 1727204572.55972: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<<
42613 1727204572.56064: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
42613 1727204572.56311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
42613 1727204572.56315: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<<
42613 1727204572.56338: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
42613 1727204572.56354: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
42613 1727204572.56484: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
42613 1727204572.58981: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<<
42613 1727204572.59210: stdout chunk (state=3): >>>import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook <<<
42613 1727204572.59234: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<<
42613 1727204572.59288: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<<
42613 1727204572.59320: stdout chunk (state=3): >>>import '_codecs' # <<<
42613 1727204572.59338: stdout chunk (state=3): >>>import 'codecs' # <<<
42613 1727204572.59375: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<<
42613 1727204572.59401: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<<
42613 1727204572.59427: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991d18530> <<<
42613 1727204572.59448: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991ce7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<<
42613 1727204572.59541: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991d1aab0> import '_signal' # import '_abc' # import 'abc' # import 'io' # <<<
42613 1727204572.59577: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<<
42613 1727204572.59675: stdout chunk (state=3): >>>import '_collections_abc' # <<<
42613 1727204572.59795: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # <<<
42613 1727204572.59798: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages <<<
42613 1727204572.59803: stdout chunk (state=3): >>>Adding directory: '/usr/local/lib/python3.12/site-packages' <<<
42613 1727204572.59809: stdout chunk (state=3): >>>Adding directory: '/usr/lib64/python3.12/site-packages' <<<
42613 1727204572.59819: stdout chunk (state=3): >>>Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<<
42613 1727204572.59849: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py <<<
42613 1727204572.59966: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991b0d190> <<<
42613 1727204572.60000: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991b0e090> <<<
42613 1727204572.60009: stdout chunk (state=3): >>>import 'site' # <<<
42613 1727204572.60011: stdout chunk (state=3): >>> <<<
42613 1727204572.60083: stdout chunk (state=3): >>>Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information.
<<<
42613 1727204572.60772: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991b4be60> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991b4bf20> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<<
42613 1727204572.60823: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<<
42613 1727204572.60850: stdout chunk (state=3): >>>import 'itertools' # <<<
42613 1727204572.60873: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991b83830> <<<
42613 1727204572.60969: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991b83ec0> import '_collections' # <<<
42613 1727204572.60996: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991b63b30> <<<
42613 1727204572.61000: stdout chunk (state=3): >>>import '_functools' # <<<
42613 1727204572.61024: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991b61250> <<<
42613 1727204572.61134: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991b49010> <<<
42613 1727204572.61188: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<<
42613 1727204572.61202: stdout chunk (state=3): >>>import '_sre' # <<<
42613 1727204572.61385: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<<
42613 1727204572.61457: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991ba7800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991ba6420> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991b62120> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991ba4c50> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991bd8890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991b482c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py <<<
42613 1727204572.61474: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<<
42613 1727204572.61501: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<<
42613 1727204572.61513: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff991bd8d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991bd8bf0> <<<
42613 1727204572.61544: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<<
42613 1727204572.61616: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff991bd8fb0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991b46de0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<<
42613 1727204572.61660: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<<
42613 1727204572.61676: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991bd9670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991bd9340> import 'importlib.machinery' # <<<
42613 1727204572.61726: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<<
42613 1727204572.61840: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991bda570> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<<
42613 1727204572.61994: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991bf47a0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff991bf5ee0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991bf6d80> <<<
42613 1727204572.62039: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' <<<
42613 1727204572.62164: stdout chunk (state=3): >>># extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff991bf73e0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991bf62d0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff991bf7e30> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991bf7560> <<<
42613 1727204572.62209: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991bda5d0> <<<
42613 1727204572.62221: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<<
42613 1727204572.62273: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<<
42613 1727204572.62294: stdout chunk (state=3): >>>#
/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 42613 1727204572.62397: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff991937da0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 42613 1727204572.62400: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff991960860> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9919605c0> <<< 42613 1727204572.62426: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff991960770> <<< 42613 1727204572.62455: stdout chunk (state=3): >>># extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff9919609b0> <<< 42613 1727204572.62479: stdout chunk 
(state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991935f40> <<< 42613 1727204572.62586: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 42613 1727204572.62696: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991962090> <<< 42613 1727204572.62717: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991960d10> <<< 42613 1727204572.62734: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991bdacc0> <<< 42613 1727204572.62754: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 42613 1727204572.62903: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 42613 1727204572.62926: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff99198e420> <<< 42613 1727204572.63029: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches 
/usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 42613 1727204572.63084: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9919a6540> <<< 42613 1727204572.63100: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 42613 1727204572.63146: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 42613 1727204572.63205: stdout chunk (state=3): >>>import 'ntpath' # <<< 42613 1727204572.63245: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9919df2f0> <<< 42613 1727204572.63357: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 42613 1727204572.63386: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 42613 1727204572.63475: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991a05a90> <<< 42613 1727204572.63563: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9919df410> <<< 42613 1727204572.63680: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9919a71d0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches 
/usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9917f0410> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9919a5580> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991962fc0> <<< 42613 1727204572.63835: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 42613 1727204572.63857: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7ff9919a5940> <<< 42613 1727204572.64116: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_uv0xlw73/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available <<< 42613 1727204572.64199: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204572.64489: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 42613 1727204572.64493: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff99185a120> import '_typing' # <<< 42613 1727204572.64638: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991831010> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 
0x7ff991830170> <<< 42613 1727204572.64655: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204572.64677: stdout chunk (state=3): >>>import 'ansible' # <<< 42613 1727204572.64693: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204572.64721: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 42613 1727204572.64787: stdout chunk (state=3): >>>import 'ansible.module_utils' # # zipimport: zlib available <<< 42613 1727204572.66444: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204572.67860: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991833fb0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 42613 1727204572.67888: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 42613 1727204572.68092: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff99188dac0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 
0x7ff99188d850> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff99188d160> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 42613 1727204572.68096: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff99188dbb0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff99185adb0> <<< 42613 1727204572.68099: stdout chunk (state=3): >>>import 'atexit' # <<< 42613 1727204572.68279: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff99188e840> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff99188ea80> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # <<< 42613 1727204572.68383: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff99188efc0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 42613 1727204572.68581: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9916f0da0> # extension 
module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff9916f29c0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9916f3380> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 42613 1727204572.68613: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9916f4560> <<< 42613 1727204572.68624: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 42613 1727204572.68795: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9916f6ff0> <<< 42613 1727204572.68799: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff9916f7350> <<< 42613 1727204572.68827: stdout chunk (state=3): >>>import 'subprocess' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7ff9916f52b0> <<< 42613 1727204572.68930: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 42613 1727204572.69040: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9916faf30> import '_tokenize' # <<< 42613 1727204572.69273: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9916f9a00> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9916f9760> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9916fbe60> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9916f57c0> <<< 42613 1727204572.69276: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7ff99173f0e0> <<< 42613 1727204572.69315: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc'<<< 42613 1727204572.69318: stdout chunk (state=3): >>> import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff99173f290> <<< 42613 1727204572.69586: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff991748d40> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991748b00> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 42613 1727204572.69781: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff99174b260> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 
0x7ff9917493a0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 42613 1727204572.69825: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff99174ea80> <<< 42613 1727204572.70147: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff99174b410> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff99174f860> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' <<< 42613 1727204572.70153: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff99174fa70> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff99174fb90> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff99173f4a0> <<< 42613 
1727204572.70171: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 42613 1727204572.70240: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 42613 1727204572.70467: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff991753350> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 42613 1727204572.70485: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff991754800> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991751ac0> <<< 42613 1727204572.70581: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff991752e70> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9917516d0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: 
zlib available <<< 42613 1727204572.70904: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 42613 1727204572.71048: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204572.71421: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204572.72246: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204572.73342: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 42613 1727204572.73497: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff9915dca10> <<< 42613 1727204572.73617: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py <<< 42613 1727204572.73722: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9915dd820> <<< 42613 1727204572.73738: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991750950> import 
'ansible.module_utils.compat.selinux' # # zipimport: zlib available <<< 42613 1727204572.73763: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204572.73781: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # <<< 42613 1727204572.73863: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204572.74242: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204572.74416: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9915dd5e0> # zipimport: zlib available <<< 42613 1727204572.74896: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204572.75334: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204572.75478: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204572.75622: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 42613 1727204572.75737: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204572.75762: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 42613 1727204572.75782: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 42613 1727204572.75810: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204572.75851: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204572.75891: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 42613 1727204572.75957: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204572.76350: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 42613 1727204572.76852: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 42613 1727204572.77272: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # <<< 42613 1727204572.77277: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9915dfe30> # zipimport: zlib available <<< 42613 1727204572.77293: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204572.77355: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 42613 1727204572.77371: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 42613 1727204572.77400: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 42613 1727204572.77441: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 42613 1727204572.77624: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 42613 1727204572.77791: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff9915e6330> <<< 42613 1727204572.77875: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff9915e6c60> import 
'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991748bf0> <<< 42613 1727204572.78015: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 42613 1727204572.78052: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # # zipimport: zlib available <<< 42613 1727204572.78099: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204572.78169: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204572.78257: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204572.78382: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 42613 1727204572.78506: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 42613 1727204572.78602: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' <<< 42613 1727204572.78606: stdout chunk (state=3): >>>import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff9915e5a00> <<< 42613 1727204572.78719: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9915e6e70> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 42613 1727204572.78746: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204572.78935: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 42613 1727204572.78966: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204572.79035: stdout chunk (state=3): >>># 
/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py <<< 42613 1727204572.79062: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 42613 1727204572.79174: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 42613 1727204572.79218: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 42613 1727204572.79246: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 42613 1727204572.79292: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 42613 1727204572.79443: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff99167ee40> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9915f3ce0> <<< 42613 1727204572.79622: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9915eed50> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9915eeba0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available <<< 42613 1727204572.79741: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available <<< 42613 1727204572.79985: stdout chunk (state=3): >>># 
zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 42613 1727204572.80049: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204572.80053: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204572.80085: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204572.80283: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available <<< 42613 1727204572.80330: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204572.80348: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204572.80384: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 42613 1727204572.80585: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204572.80600: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204572.80987: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 42613 1727204572.81091: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991685c40> # 
/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 42613 1727204572.81138: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 42613 1727204572.81185: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' <<< 42613 1727204572.81494: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff990b54410> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff990b54800> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991665460> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991664560> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9916842f0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9916846e0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 
42613 1727204572.81527: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 42613 1727204572.81716: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff990b577a0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff990b57050> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff990b57230> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff990b56480> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 42613 1727204572.81992: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff990b57860> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import 
'_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff990bba360> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff990bb8380> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9916853a0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # <<< 42613 1727204572.81995: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204572.81998: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other' # <<< 42613 1727204572.82011: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204572.82080: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204572.82243: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available <<< 42613 1727204572.82295: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available <<< 42613 1727204572.82299: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system' # <<< 42613 1727204572.82379: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # <<< 42613 1727204572.82499: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 42613 1727204572.82502: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available <<< 42613 1727204572.82597: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # <<< 42613 1727204572.82612: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204572.82884: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 42613 1727204572.83440: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204572.83947: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available <<< 42613 1727204572.83993: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204572.84146: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # <<< 42613 1727204572.84194: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 42613 1727204572.84400: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # <<< 42613 1727204572.84441: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 42613 1727204572.84445: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 42613 1727204572.84448: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204572.84545: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available <<< 42613 1727204572.84705: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204572.84749: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff990bba570> <<< 42613 1727204572.84752: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 42613 
1727204572.84880: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 42613 1727204572.85173: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff990bbb1d0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available <<< 42613 1727204572.85192: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204572.85338: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available <<< 42613 1727204572.85586: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 42613 1727204572.85624: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 42613 1727204572.85783: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff990bea690> <<< 42613 1727204572.86045: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff990bd64e0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available <<< 42613 1727204572.86080: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204572.86478: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 42613 1727204572.86503: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204572.86595: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # <<< 42613 1727204572.86609: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available <<< 42613 1727204572.86697: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 42613 1727204572.86712: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204572.86749: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204572.86810: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py <<< 42613 1727204572.86814: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 42613 1727204572.86856: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 42613 1727204572.86877: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff990a060f0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff990a06060> <<< 42613 1727204572.86903: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available <<< 42613 1727204572.86922: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware' # <<< 42613 1727204572.86936: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204572.87035: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # <<< 42613 1727204572.87039: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 
1727204572.87207: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204572.87379: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available <<< 42613 1727204572.87497: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204572.87872: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 42613 1727204572.87972: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204572.88079: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 42613 1727204572.88094: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204572.88257: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204572.88366: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available <<< 42613 1727204572.88490: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 42613 1727204572.89187: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204572.89753: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available <<< 42613 1727204572.90017: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available <<< 42613 1727204572.90197: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204572.90202: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available <<< 42613 1727204572.90368: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 
1727204572.90560: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available <<< 42613 1727204572.90643: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available <<< 42613 1727204572.90646: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204572.90708: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available <<< 42613 1727204572.90956: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 42613 1727204572.91157: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204572.91471: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available <<< 42613 1727204572.91506: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available <<< 42613 1727204572.91536: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # <<< 42613 1727204572.91539: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204572.91896: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # <<< 42613 1727204572.91900: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204572.91959: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204572.92088: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available <<< 42613 1727204572.92344: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 42613 1727204572.92637: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available <<< 42613 1727204572.92787: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available <<< 42613 1727204572.92816: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204572.92852: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available <<< 42613 1727204572.92894: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204572.92941: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available <<< 42613 1727204572.92970: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204572.93014: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available <<< 42613 1727204572.93115: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204572.93224: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 42613 1727204572.93228: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 42613 1727204572.93251: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available <<< 42613 1727204572.93289: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204572.93448: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available <<< 42613 1727204572.93452: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 42613 1727204572.93495: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204572.93573: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204572.93787: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # <<< 42613 1727204572.93791: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # <<< 42613 1727204572.93804: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204572.94022: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204572.94244: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 42613 1727204572.94386: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available <<< 42613 1727204572.94505: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204572.94528: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available <<< 42613 1727204572.94555: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204572.94653: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 42613 1727204572.94679: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204572.94760: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204572.94862: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 42613 1727204572.94956: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204572.96607: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 42613 1727204572.96612: stdout chunk 
(state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff990a2f620> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff990a2cb00> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff990a2d9d0> <<< 42613 1727204573.08174: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py <<< 42613 1727204573.08197: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff990a74710> <<< 42613 1727204573.08235: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py <<< 42613 1727204573.08253: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' <<< 42613 1727204573.08302: stdout chunk (state=3): >>>import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff990a755e0> <<< 42613 1727204573.08369: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from 
'/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py <<< 42613 1727204573.08782: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff990a77920> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff990a766c0> <<< 42613 1727204573.08814: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame<<< 42613 1727204573.08905: stdout chunk (state=3): >>> PyThreadState_Clear: warning: thread still has a frame <<< 42613 1727204573.08930: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame <<< 42613 1727204573.09085: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 42613 1727204573.35105: stdout chunk (state=3): >>> <<< 42613 1727204573.35122: stdout chunk (state=3): >>>{"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_fips": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "02", "second": "52", "epoch": "1727204572", "epoch_int": "1727204572", "date": "2024-09-24", "time": 
"15:02:52", "iso8601_micro": "2024-09-24T19:02:52.957469Z", "iso8601": "2024-09-24T19:02:52Z", "iso8601_basic": "20240924T150252957469", "iso8601_basic_short": "20240924T150252", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2f45753f4562ec5cb5af974e0ba4b7", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam"<<< 42613 1727204573.35234: stdout chunk (state=3): >>>, "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 39754 10.31.45.169 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 39754 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_user_id": "root", 
"ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_lsb": {}, "ansible_is_chroot": false, "ansible_apparmor": {"status": "disabled"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3046, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 670, "free": 3046}, "nocache": {"free": 3490, "used": 226}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2f4575-3f45-62ec-5cb5-af974e0ba4b7", "ansible_product_uuid": "ec2f4575-3f45-62ec-5cb5-af974e0ba4b7", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": 
"4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 910, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251303927808, "block_size": 4096, "block_total": 64479564, "block_available": 61353498, "block_used": 3126066, "inode_total": 16384000, "inode_available": 16301443, "inode_used": 82557, "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20d<<< 42613 1727204573.35268: stdout chunk (state=3): >>>b391"}], "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQDMno02CCUpdfdqhSLw3DEMgRB2qjjltGvjvvCA4FaqYC6LCLimL/modyFwZBTr6mod5aVbEv6eVT3cFEFELkt0kQvtOziptiFW5YZ0dlqvF004nHv7tpUqUboaKXf3hY9kfDIHOuUKZOV1AH7UTuNGixxTuYXFQ+fG7hLGh4Vep864Qk6wN5hv56JDtXEzMMB7xxbnEU6nTFIA8TIX+aYYYxIipVJjI+TR9J9VhQf+oNDJhhqqHyobnqG5WTt3jEYQo+8cWC4B8LegOCwae4jpCrLwhKnzmvV787NTqy90vgHgain4GhWTCKI+2dFsqryKBgKIBuENXOpmpRGT4gqBQHbc5v/vxWqYoPPhg1Wb8R+WRueYbdol4I10CveFNShlWCSRLSu/vOutS0xtU3WEIQFs2Mn06Aqq6bMoG70EJ9bJOEQ82f23VIkVKoW1hmcKHTCrMv715oNONo08LOTQkBYDv3MQpAtFQnpuIVPlAXYu1spAx3i2i31866ukCUE=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBG5WDcSyqLv17rg+6P3+2pjKf2x2X+Jf8yHGACagVeIm/l8LWG2NszXTHOdaZlbD4aes7hBRe0B7oCa8ilqHGf0=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAII5o3yNikV31ncy7je2EsNwog36vbYT7D9w98r4ZeD7x", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_loadavg": {"1m": 0.712890625, "5m": 0.654296875, "15m": 0.42626953125}, "ansible_local": {}, "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", 
"generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "02:aa:78:a8:9b:13", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.169", "broadcast": "10.31.47.255", "netmask": 
"255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ip<<< 42613 1727204573.35296: stdout chunk (state=3): >>>v6": [{"address": "fe80::aa:78ff:fea8:9b13", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off 
[fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.169", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:aa:78:a8:9b:13", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.169"], "ansible_all_ipv6_addresses": ["fe80::aa:78ff:fea8:9b13"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.169", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::aa:78ff:fea8:9b13"]}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fibre_channel_wwn": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "ansible_iscsi_iqn": "", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, 
"ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 42613 1727204573.36342: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins<<< 42613 1727204573.36385: stdout chunk (state=3): >>> # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc<<< 42613 1727204573.36551: stdout chunk (state=3): >>> # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # 
cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # 
destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime<<< 42613 1727204573.36829: stdout chunk (state=3): >>> # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy 
ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # 
destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] 
removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl <<< 42613 1727204573.36835: stdout chunk (state=3): >>># cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios<<< 42613 1727204573.36856: stdout chunk (state=3): >>> # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing 
ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly<<< 42613 1727204573.36896: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos <<< 42613 1727204573.36903: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # 
cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos<<< 42613 1727204573.36943: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter<<< 42613 1727204573.36983: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base<<< 42613 1727204573.37008: stdout chunk (state=3): >>> # destroy 
ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos <<< 42613 1727204573.37045: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin <<< 42613 1727204573.37075: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy 
ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata <<< 42613 1727204573.37146: stdout chunk (state=3): >>># cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 42613 1727204573.37927: stdout chunk (state=3): >>># destroy _sitebuiltins<<< 42613 1727204573.38089: stdout chunk (state=3): >>> # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression<<< 42613 1727204573.38112: stdout chunk (state=3): >>> # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress<<< 42613 1727204573.38182: stdout chunk (state=3): >>> # destroy ntpath <<< 42613 1727204573.38278: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon<<< 42613 1727204573.38281: stdout chunk (state=3): >>> # destroy hashlib # destroy json.decoder # destroy json.encoder<<< 42613 1727204573.38383: stdout chunk (state=3): >>> # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess<<< 42613 1727204573.38509: stdout chunk (state=3): >>> # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 42613 1727204573.38602: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector <<< 42613 1727204573.38748: stdout chunk (state=3): >>># destroy multiprocessing # 
destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool<<< 42613 1727204573.38751: stdout chunk (state=3): >>> # destroy signal <<< 42613 1727204573.38781: stdout chunk (state=3): >>># destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction<<< 42613 1727204573.38949: stdout chunk (state=3): >>> # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl<<< 42613 1727204573.38977: stdout chunk (state=3): >>> <<< 42613 1727204573.38993: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux <<< 42613 1727204573.39024: stdout chunk (state=3): >>># destroy getpass # destroy pwd # destroy termios<<< 42613 1727204573.39094: stdout chunk (state=3): >>> # destroy json <<< 42613 1727204573.39103: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob <<< 42613 1727204573.39241: stdout chunk (state=3): >>># destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util<<< 42613 1727204573.39340: stdout chunk (state=3): >>> # destroy _multiprocessing # destroy array <<< 42613 1727204573.39386: stdout chunk (state=3): >>># destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves<<< 42613 1727204573.39427: stdout chunk (state=3): >>> # destroy 
configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime<<< 42613 1727204573.39451: stdout chunk (state=3): >>> # cleanup[3] wiping traceback # destroy linecache<<< 42613 1727204573.39525: stdout chunk (state=3): >>> # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437<<< 42613 1727204573.39608: stdout chunk (state=3): >>> # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math<<< 42613 1727204573.39611: stdout chunk (state=3): >>> # cleanup[3] wiping warnings<<< 42613 1727204573.39614: stdout chunk (state=3): >>> # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct<<< 42613 1727204573.39685: stdout chunk (state=3): >>> # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser <<< 42613 1727204573.39806: stdout chunk (state=3): >>># cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc<<< 42613 1727204573.39813: stdout chunk (state=3): >>> # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath<<< 42613 1727204573.39878: stdout chunk (state=3): >>> # 
cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs<<< 42613 1727204573.39910: stdout chunk (state=3): >>> # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys <<< 42613 1727204573.39916: stdout chunk (state=3): >>># cleanup[3] wiping builtins # destroy selinux._selinux <<< 42613 1727204573.40130: stdout chunk (state=3): >>># destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 42613 1727204573.40244: stdout chunk (state=3): >>># destroy sys.monitoring <<< 42613 1727204573.40271: stdout chunk (state=3): >>># destroy _socket <<< 42613 1727204573.40292: stdout chunk (state=3): >>># destroy _collections <<< 42613 1727204573.40327: stdout chunk (state=3): >>># destroy platform # destroy _uuid<<< 42613 1727204573.40387: stdout chunk (state=3): >>> # destroy stat # destroy genericpath <<< 42613 1727204573.40457: stdout chunk (state=3): >>># destroy re._parser # destroy tokenize<<< 42613 1727204573.40562: stdout chunk (state=3): >>> # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 42613 1727204573.40604: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy 
ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp <<< 42613 1727204573.40692: stdout chunk (state=3): >>># destroy _io # destroy marshal # clear sys.meta_path <<< 42613 1727204573.40707: stdout chunk (state=3): >>># clear sys.modules # destroy _frozen_importlib<<< 42613 1727204573.40736: stdout chunk (state=3): >>> <<< 42613 1727204573.40874: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8<<< 42613 1727204573.40941: stdout chunk (state=3): >>> # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time<<< 42613 1727204573.41004: stdout chunk (state=3): >>> # destroy _random <<< 42613 1727204573.41077: stdout chunk (state=3): >>># destroy _weakref # destroy _operator <<< 42613 1727204573.41110: stdout chunk (state=3): >>># destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc <<< 42613 1727204573.41155: stdout chunk (state=3): >>># destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks<<< 42613 1727204573.41183: stdout chunk (state=3): >>> <<< 42613 1727204573.41975: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
<<< 42613 1727204573.41979: stdout chunk (state=3): >>><<< 42613 1727204573.41981: stderr chunk (state=3): >>><<< 42613 1727204573.42302: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991d18530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991ce7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991d1aab0> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # 
/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991b0d190> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991b0e090> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991b4be60> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991b4bf20> # 
/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991b83830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991b83ec0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991b63b30> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991b61250> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991b49010> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991ba7800> import 're._parser' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7ff991ba6420> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991b62120> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991ba4c50> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991bd8890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991b482c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff991bd8d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991bd8bf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff991bd8fb0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991b46de0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from 
'/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991bd9670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991bd9340> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991bda570> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991bf47a0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff991bf5ee0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7ff991bf6d80> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff991bf73e0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991bf62d0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff991bf7e30> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991bf7560> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991bda5d0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff991937da0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension 
module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff991960860> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9919605c0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff991960770> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff9919609b0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991935f40> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991962090> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991960d10> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991bdacc0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff99198e420> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9919a6540> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9919df2f0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991a05a90> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9919df410> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9919a71d0> # 
/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9917f0410> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9919a5580> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991962fc0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7ff9919a5940> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_uv0xlw73/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff99185a120> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991831010> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991830170> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches 
/usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991833fb0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff99188dac0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff99188d850> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff99188d160> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff99188dbb0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff99185adb0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7ff99188e840> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff99188ea80> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff99188efc0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9916f0da0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff9916f29c0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9916f3380> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9916f4560> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9916f6ff0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff9916f7350> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9916f52b0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9916faf30> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9916f9a00> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9916f9760> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7ff9916fbe60> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9916f57c0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff99173f0e0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff99173f290> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff991748d40> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991748b00> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from 
'/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff99174b260> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9917493a0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff99174ea80> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff99174b410> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff99174f860> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff99174fa70> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff99174fb90> import 'systemd.journal' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7ff99173f4a0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff991753350> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff991754800> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991751ac0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff991752e70> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9917516d0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available 
import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff9915dca10> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9915dd820> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991750950> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9915dd5e0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9915dfe30> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff9915e6330> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff9915e6c60> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991748bf0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib 
available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff9915e5a00> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9915e6e70> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff99167ee40> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9915f3ce0> import 'distro.distro' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7ff9915eed50> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9915eeba0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991685c40> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object 
from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff990b54410> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff990b54800> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991665460> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff991664560> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9916842f0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9916846e0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' 
executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff990b577a0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff990b57050> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff990b57230> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff990b56480> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff990b57860> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff990bba360> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff990bb8380> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9916853a0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff990bba570> # 
/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff990bbb1d0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff990bea690> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff990bd64e0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from 
'/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff990a060f0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff990a06060> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff990a2f620> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff990a2cb00> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff990a2d9d0> # 
/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff990a74710> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff990a755e0> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff990a77920> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff990a766c0> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_fips": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, 
"version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "02", "second": "52", "epoch": "1727204572", "epoch_int": "1727204572", "date": "2024-09-24", "time": "15:02:52", "iso8601_micro": "2024-09-24T19:02:52.957469Z", "iso8601": "2024-09-24T19:02:52Z", "iso8601_basic": "20240924T150252957469", "iso8601_basic_short": "20240924T150252", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2f45753f4562ec5cb5af974e0ba4b7", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 39754 10.31.45.169 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", 
"SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 39754 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_lsb": {}, "ansible_is_chroot": false, "ansible_apparmor": {"status": "disabled"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3046, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 670, "free": 3046}, "nocache": {"free": 3490, "used": 226}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2f4575-3f45-62ec-5cb5-af974e0ba4b7", "ansible_product_uuid": "ec2f4575-3f45-62ec-5cb5-af974e0ba4b7", "ansible_product_version": "4.11.amazon", 
"ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 910, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251303927808, "block_size": 4096, "block_total": 64479564, "block_available": 61353498, "block_used": 3126066, "inode_total": 16384000, "inode_available": 16301443, "inode_used": 82557, "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391"}], "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQDMno02CCUpdfdqhSLw3DEMgRB2qjjltGvjvvCA4FaqYC6LCLimL/modyFwZBTr6mod5aVbEv6eVT3cFEFELkt0kQvtOziptiFW5YZ0dlqvF004nHv7tpUqUboaKXf3hY9kfDIHOuUKZOV1AH7UTuNGixxTuYXFQ+fG7hLGh4Vep864Qk6wN5hv56JDtXEzMMB7xxbnEU6nTFIA8TIX+aYYYxIipVJjI+TR9J9VhQf+oNDJhhqqHyobnqG5WTt3jEYQo+8cWC4B8LegOCwae4jpCrLwhKnzmvV787NTqy90vgHgain4GhWTCKI+2dFsqryKBgKIBuENXOpmpRGT4gqBQHbc5v/vxWqYoPPhg1Wb8R+WRueYbdol4I10CveFNShlWCSRLSu/vOutS0xtU3WEIQFs2Mn06Aqq6bMoG70EJ9bJOEQ82f23VIkVKoW1hmcKHTCrMv715oNONo08LOTQkBYDv3MQpAtFQnpuIVPlAXYu1spAx3i2i31866ukCUE=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBG5WDcSyqLv17rg+6P3+2pjKf2x2X+Jf8yHGACagVeIm/l8LWG2NszXTHOdaZlbD4aes7hBRe0B7oCa8ilqHGf0=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAII5o3yNikV31ncy7je2EsNwog36vbYT7D9w98r4ZeD7x", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_loadavg": {"1m": 0.712890625, "5m": 0.654296875, "15m": 0.42626953125}, "ansible_local": {}, "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", 
"generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "02:aa:78:a8:9b:13", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.169", "broadcast": "10.31.47.255", "netmask": 
"255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::aa:78ff:fea8:9b13", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", 
"rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.169", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:aa:78:a8:9b:13", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.169"], "ansible_all_ipv6_addresses": ["fe80::aa:78ff:fea8:9b13"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.169", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::aa:78ff:fea8:9b13"]}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fibre_channel_wwn": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "ansible_iscsi_iqn": "", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_service_mgr": "systemd", 
"ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # 
cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] 
removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy 
ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # 
cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing 
ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing 
ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # 
destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy 
ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # 
destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves 
# destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping 
_thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 
originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. [WARNING]: Module invocation had junk after the JSON data: [duplicate interpreter-shutdown cleanup output omitted; identical to the "# clear sys.path_importer_cache ... # cleanup[2] ..." block above]
removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user 
# cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing 
ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy 
ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy 
ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # 
destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # 
cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy 
traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks [WARNING]: Platform linux on host managed-node3 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible- core/2.17/reference_appendices/interpreter_discovery.html for more information. 42613 1727204573.44385: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204571.5003932-42690-154033137034559/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 42613 1727204573.44389: _low_level_execute_command(): starting 42613 1727204573.44392: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204571.5003932-42690-154033137034559/ > /dev/null 2>&1 && sleep 0' 42613 1727204573.44777: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204573.44961: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204573.45012: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204573.45221: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204573.48196: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204573.48200: stdout chunk (state=3): >>><<< 42613 1727204573.48204: stderr chunk (state=3): >>><<< 42613 1727204573.48223: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204573.48278: handler run complete 42613 1727204573.48593: variable 'ansible_facts' from source: unknown 42613 1727204573.48597: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204573.48993: variable 'ansible_facts' from source: unknown 42613 1727204573.49102: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204573.49390: attempt loop complete, returning result 42613 1727204573.49409: _execute() done 42613 1727204573.49571: dumping result to json 42613 1727204573.49574: done dumping result, returning 42613 1727204573.49576: done running TaskExecutor() for managed-node3/TASK: Gathering Facts [127b8e07-fff9-2f91-05d8-0000000000af] 42613 1727204573.49579: sending task result for task 127b8e07-fff9-2f91-05d8-0000000000af 42613 1727204573.49929: done sending task result for task 127b8e07-fff9-2f91-05d8-0000000000af 42613 1727204573.49933: WORKER PROCESS EXITING ok: [managed-node3] 42613 1727204573.50321: no more pending results, returning what we have 42613 1727204573.50324: results queue empty 42613 1727204573.50331: checking for any_errors_fatal 42613 1727204573.50335: done checking for any_errors_fatal 42613 1727204573.50336: checking for max_fail_percentage 42613 1727204573.50338: done checking for max_fail_percentage 42613 1727204573.50339: 
checking to see if all hosts have failed and the running result is not ok 42613 1727204573.50340: done checking to see if all hosts have failed 42613 1727204573.50341: getting the remaining hosts for this loop 42613 1727204573.50343: done getting the remaining hosts for this loop 42613 1727204573.50348: getting the next task for host managed-node3 42613 1727204573.50354: done getting next task for host managed-node3 42613 1727204573.50355: ^ task is: TASK: meta (flush_handlers) 42613 1727204573.50358: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 42613 1727204573.50362: getting variables 42613 1727204573.50363: in VariableManager get_vars() 42613 1727204573.50390: Calling all_inventory to load vars for managed-node3 42613 1727204573.50392: Calling groups_inventory to load vars for managed-node3 42613 1727204573.50396: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204573.50407: Calling all_plugins_play to load vars for managed-node3 42613 1727204573.50409: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204573.50412: Calling groups_plugins_play to load vars for managed-node3 42613 1727204573.50701: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204573.50934: done with get_vars() 42613 1727204573.50950: done getting variables 42613 1727204573.51029: in VariableManager get_vars() 42613 1727204573.51040: Calling all_inventory to load vars for managed-node3 42613 1727204573.51043: Calling groups_inventory to load vars for managed-node3 42613 1727204573.51045: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204573.51051: Calling all_plugins_play to load vars for managed-node3 42613 
1727204573.51053: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204573.51056: Calling groups_plugins_play to load vars for managed-node3 42613 1727204573.51242: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204573.51471: done with get_vars() 42613 1727204573.51489: done queuing things up, now waiting for results queue to drain 42613 1727204573.51491: results queue empty 42613 1727204573.51492: checking for any_errors_fatal 42613 1727204573.51494: done checking for any_errors_fatal 42613 1727204573.51495: checking for max_fail_percentage 42613 1727204573.51496: done checking for max_fail_percentage 42613 1727204573.51497: checking to see if all hosts have failed and the running result is not ok 42613 1727204573.51498: done checking to see if all hosts have failed 42613 1727204573.51504: getting the remaining hosts for this loop 42613 1727204573.51505: done getting the remaining hosts for this loop 42613 1727204573.51508: getting the next task for host managed-node3 42613 1727204573.51513: done getting next task for host managed-node3 42613 1727204573.51516: ^ task is: TASK: Include the task 'el_repo_setup.yml' 42613 1727204573.51517: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204573.51519: getting variables 42613 1727204573.51520: in VariableManager get_vars() 42613 1727204573.51530: Calling all_inventory to load vars for managed-node3 42613 1727204573.51533: Calling groups_inventory to load vars for managed-node3 42613 1727204573.51535: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204573.51541: Calling all_plugins_play to load vars for managed-node3 42613 1727204573.51544: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204573.51546: Calling groups_plugins_play to load vars for managed-node3 42613 1727204573.51713: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204573.51938: done with get_vars() 42613 1727204573.51948: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tests_routing_rules_nm.yml:11 Tuesday 24 September 2024 15:02:53 -0400 (0:00:02.102) 0:00:02.128 ***** 42613 1727204573.52040: entering _queue_task() for managed-node3/include_tasks 42613 1727204573.52042: Creating lock for include_tasks 42613 1727204573.52413: worker is 1 (out of 1 available) 42613 1727204573.52429: exiting _queue_task() for managed-node3/include_tasks 42613 1727204573.52445: done queuing things up, now waiting for results queue to drain 42613 1727204573.52447: waiting for pending results... 
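The entries above show the controller queuing an `include_tasks` action for `el_repo_setup.yml` (task path `tests_routing_rules_nm.yml:11`). As a point of reference, a minimal sketch of the kind of include step being processed here; only the task name and file name come from the log itself, the surrounding structure is an assumption:

```yaml
# Hypothetical reconstruction of the include seen in the log;
# the real play in tests_routing_rules_nm.yml may differ.
- name: Include the task 'el_repo_setup.yml'
  include_tasks: tasks/el_repo_setup.yml
```

On include, the controller loads the referenced file, generates blocks from it, filters them on tags, and extends the per-host task list, which is exactly the sequence of `generating all_blocks data` / `filtering new block on tags` / `extending task lists` entries that follow.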
42613 1727204573.52744: running TaskExecutor() for managed-node3/TASK: Include the task 'el_repo_setup.yml' 42613 1727204573.52837: in run() - task 127b8e07-fff9-2f91-05d8-000000000006 42613 1727204573.52841: variable 'ansible_search_path' from source: unknown 42613 1727204573.52943: calling self._execute() 42613 1727204573.52981: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204573.52994: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204573.53010: variable 'omit' from source: magic vars 42613 1727204573.53137: _execute() done 42613 1727204573.53146: dumping result to json 42613 1727204573.53159: done dumping result, returning 42613 1727204573.53174: done running TaskExecutor() for managed-node3/TASK: Include the task 'el_repo_setup.yml' [127b8e07-fff9-2f91-05d8-000000000006] 42613 1727204573.53375: sending task result for task 127b8e07-fff9-2f91-05d8-000000000006 42613 1727204573.53468: done sending task result for task 127b8e07-fff9-2f91-05d8-000000000006 42613 1727204573.53472: WORKER PROCESS EXITING 42613 1727204573.53520: no more pending results, returning what we have 42613 1727204573.53525: in VariableManager get_vars() 42613 1727204573.53556: Calling all_inventory to load vars for managed-node3 42613 1727204573.53558: Calling groups_inventory to load vars for managed-node3 42613 1727204573.53561: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204573.53575: Calling all_plugins_play to load vars for managed-node3 42613 1727204573.53577: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204573.53580: Calling groups_plugins_play to load vars for managed-node3 42613 1727204573.53813: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204573.54030: done with get_vars() 42613 1727204573.54039: variable 'ansible_search_path' from source: unknown 42613 1727204573.54056: we have 
included files to process 42613 1727204573.54058: generating all_blocks data 42613 1727204573.54059: done generating all_blocks data 42613 1727204573.54060: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 42613 1727204573.54061: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 42613 1727204573.54064: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 42613 1727204573.54780: in VariableManager get_vars() 42613 1727204573.54799: done with get_vars() 42613 1727204573.54813: done processing included file 42613 1727204573.54816: iterating over new_blocks loaded from include file 42613 1727204573.54818: in VariableManager get_vars() 42613 1727204573.54829: done with get_vars() 42613 1727204573.54830: filtering new block on tags 42613 1727204573.54846: done filtering new block on tags 42613 1727204573.54849: in VariableManager get_vars() 42613 1727204573.54860: done with get_vars() 42613 1727204573.54861: filtering new block on tags 42613 1727204573.54880: done filtering new block on tags 42613 1727204573.54883: in VariableManager get_vars() 42613 1727204573.54894: done with get_vars() 42613 1727204573.54895: filtering new block on tags 42613 1727204573.54909: done filtering new block on tags 42613 1727204573.54911: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed-node3 42613 1727204573.54918: extending task lists for all hosts with included blocks 42613 1727204573.55003: done extending task lists 42613 1727204573.55004: done processing included files 42613 1727204573.55005: results queue empty 42613 1727204573.55006: checking for any_errors_fatal 42613 1727204573.55008: done checking for any_errors_fatal 42613 
1727204573.55009: checking for max_fail_percentage 42613 1727204573.55010: done checking for max_fail_percentage 42613 1727204573.55010: checking to see if all hosts have failed and the running result is not ok 42613 1727204573.55011: done checking to see if all hosts have failed 42613 1727204573.55012: getting the remaining hosts for this loop 42613 1727204573.55013: done getting the remaining hosts for this loop 42613 1727204573.55016: getting the next task for host managed-node3 42613 1727204573.55020: done getting next task for host managed-node3 42613 1727204573.55022: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 42613 1727204573.55025: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204573.55027: getting variables 42613 1727204573.55028: in VariableManager get_vars() 42613 1727204573.55037: Calling all_inventory to load vars for managed-node3 42613 1727204573.55039: Calling groups_inventory to load vars for managed-node3 42613 1727204573.55042: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204573.55047: Calling all_plugins_play to load vars for managed-node3 42613 1727204573.55050: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204573.55053: Calling groups_plugins_play to load vars for managed-node3 42613 1727204573.55225: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204573.55460: done with get_vars() 42613 1727204573.55472: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Tuesday 24 September 2024 15:02:53 -0400 (0:00:00.035) 0:00:02.163 ***** 42613 1727204573.55545: entering _queue_task() for managed-node3/setup 42613 1727204573.55894: worker is 1 (out of 1 available) 42613 1727204573.55908: exiting _queue_task() for managed-node3/setup 42613 1727204573.55921: done queuing things up, now waiting for results queue to drain 42613 1727204573.55922: waiting for pending results... 
42613 1727204573.56385: running TaskExecutor() for managed-node3/TASK: Gather the minimum subset of ansible_facts required by the network role test 42613 1727204573.56391: in run() - task 127b8e07-fff9-2f91-05d8-0000000000c0 42613 1727204573.56394: variable 'ansible_search_path' from source: unknown 42613 1727204573.56396: variable 'ansible_search_path' from source: unknown 42613 1727204573.56399: calling self._execute() 42613 1727204573.56468: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204573.56482: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204573.56497: variable 'omit' from source: magic vars 42613 1727204573.57082: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 42613 1727204573.60054: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 42613 1727204573.60142: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 42613 1727204573.60189: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 42613 1727204573.60234: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 42613 1727204573.60265: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 42613 1727204573.60378: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204573.60421: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204573.60453: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204573.60501: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204573.60634: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204573.60727: variable 'ansible_facts' from source: unknown 42613 1727204573.60819: variable 'network_test_required_facts' from source: task vars 42613 1727204573.60873: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 42613 1727204573.60884: variable 'omit' from source: magic vars 42613 1727204573.60923: variable 'omit' from source: magic vars 42613 1727204573.60961: variable 'omit' from source: magic vars 42613 1727204573.60993: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 42613 1727204573.61023: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 42613 1727204573.61046: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 42613 1727204573.61071: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204573.61087: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204573.61121: variable 'inventory_hostname' from source: host vars for 'managed-node3' 42613 1727204573.61127: variable 'ansible_host' from source: host vars for 
'managed-node3' 42613 1727204573.61134: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204573.61256: Set connection var ansible_shell_executable to /bin/sh 42613 1727204573.61270: Set connection var ansible_pipelining to False 42613 1727204573.61289: Set connection var ansible_module_compression to ZIP_DEFLATED 42613 1727204573.61296: Set connection var ansible_connection to ssh 42613 1727204573.61307: Set connection var ansible_timeout to 10 42613 1727204573.61314: Set connection var ansible_shell_type to sh 42613 1727204573.61347: variable 'ansible_shell_executable' from source: unknown 42613 1727204573.61391: variable 'ansible_connection' from source: unknown 42613 1727204573.61394: variable 'ansible_module_compression' from source: unknown 42613 1727204573.61397: variable 'ansible_shell_type' from source: unknown 42613 1727204573.61399: variable 'ansible_shell_executable' from source: unknown 42613 1727204573.61401: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204573.61403: variable 'ansible_pipelining' from source: unknown 42613 1727204573.61405: variable 'ansible_timeout' from source: unknown 42613 1727204573.61407: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204573.61687: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 42613 1727204573.61708: variable 'omit' from source: magic vars 42613 1727204573.61784: starting attempt loop 42613 1727204573.61787: running the handler 42613 1727204573.61836: _low_level_execute_command(): starting 42613 1727204573.61850: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 42613 1727204573.62809: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 
1727204573.62907: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204573.62977: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204573.63029: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204573.63033: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204573.63233: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204573.65444: stdout chunk (state=3): >>>/root <<< 42613 1727204573.65653: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204573.65679: stderr chunk (state=3): >>><<< 42613 1727204573.65683: stdout chunk (state=3): >>><<< 42613 1727204573.65705: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 
originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204573.65717: _low_level_execute_command(): starting 42613 1727204573.65724: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204573.657051-42761-267739743121753 `" && echo ansible-tmp-1727204573.657051-42761-267739743121753="` echo /root/.ansible/tmp/ansible-tmp-1727204573.657051-42761-267739743121753 `" ) && sleep 0' 42613 1727204573.66244: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204573.66249: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration <<< 42613 
1727204573.66252: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204573.66311: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204573.66315: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204573.66435: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204573.68713: stdout chunk (state=3): >>>ansible-tmp-1727204573.657051-42761-267739743121753=/root/.ansible/tmp/ansible-tmp-1727204573.657051-42761-267739743121753 <<< 42613 1727204573.68732: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204573.68789: stderr chunk (state=3): >>><<< 42613 1727204573.68804: stdout chunk (state=3): >>><<< 42613 1727204573.68833: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204573.657051-42761-267739743121753=/root/.ansible/tmp/ansible-tmp-1727204573.657051-42761-267739743121753 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204573.68896: variable 'ansible_module_compression' from source: unknown 42613 1727204573.68971: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-42613igbktgy6/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 42613 1727204573.69025: variable 'ansible_facts' from source: unknown 42613 1727204573.69235: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204573.657051-42761-267739743121753/AnsiballZ_setup.py 42613 1727204573.69459: Sending initial data 42613 1727204573.69463: Sent initial data (153 bytes) 42613 1727204573.70337: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204573.70377: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204573.70399: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204573.70488: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204573.70599: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204573.72406: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 42613 1727204573.72490: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 42613 1727204573.72574: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-42613igbktgy6/tmpmjyjxgo5 /root/.ansible/tmp/ansible-tmp-1727204573.657051-42761-267739743121753/AnsiballZ_setup.py <<< 42613 1727204573.72581: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204573.657051-42761-267739743121753/AnsiballZ_setup.py" <<< 42613 1727204573.72634: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-42613igbktgy6/tmpmjyjxgo5" to remote "/root/.ansible/tmp/ansible-tmp-1727204573.657051-42761-267739743121753/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204573.657051-42761-267739743121753/AnsiballZ_setup.py" <<< 42613 1727204573.74997: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204573.75012: stderr chunk (state=3): >>><<< 42613 1727204573.75019: stdout chunk (state=3): >>><<< 42613 1727204573.75055: done transferring module to remote 42613 1727204573.75188: _low_level_execute_command(): starting 42613 1727204573.75192: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204573.657051-42761-267739743121753/ /root/.ansible/tmp/ansible-tmp-1727204573.657051-42761-267739743121753/AnsiballZ_setup.py && sleep 0' 42613 1727204573.76252: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204573.76463: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204573.76534: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204573.78697: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204573.78701: stdout chunk (state=3): >>><<< 42613 1727204573.78703: stderr chunk (state=3): >>><<< 42613 1727204573.78821: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204573.78825: _low_level_execute_command(): starting 42613 1727204573.78828: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204573.657051-42761-267739743121753/AnsiballZ_setup.py && sleep 0' 42613 1727204573.79539: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204573.79559: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204573.79580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204573.79599: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 42613 1727204573.79652: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204573.79718: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204573.79749: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204573.79774: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204573.80037: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204573.82478: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 42613 1727204573.82503: stdout chunk (state=3): >>>import _imp # builtin <<< 42613 1727204573.82530: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # <<< 42613 1727204573.82551: stdout chunk (state=3): >>>import '_weakref' # <<< 42613 1727204573.82619: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 42613 1727204573.82655: stdout chunk (state=3): >>>import 'posix' # <<< 42613 1727204573.82694: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 42613 1727204573.82737: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # <<< 42613 1727204573.82748: stdout chunk (state=3): >>># installed zipimport hook <<< 42613 1727204573.82791: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 42613 1727204573.82830: stdout chunk (state=3): >>>import '_codecs' # <<< 42613 1727204573.82853: stdout chunk (state=3): >>>import 'codecs' # <<< 42613 1727204573.82884: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 42613 1727204573.82942: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb259118530> <<< 42613 1727204573.82945: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb2590e7b30> <<< 42613 1727204573.82989: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb25911aab0> <<< 42613 1727204573.82995: stdout chunk (state=3): >>>import '_signal' # <<< 42613 1727204573.83025: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <<< 42613 1727204573.83046: stdout chunk (state=3): >>>import 'io' # <<< 42613 1727204573.83086: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 42613 1727204573.83182: stdout chunk (state=3): >>>import '_collections_abc' # <<< 42613 1727204573.83224: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 42613 1727204573.83247: stdout chunk (state=3): >>>import 'os' # <<< 42613 1727204573.83323: stdout chunk (state=3): >>>import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' <<< 42613 1727204573.83329: stdout chunk (state=3): >>>Adding directory: '/usr/lib/python3.12/site-packages' <<< 42613 1727204573.83358: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 42613 1727204573.83387: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 42613 1727204573.83400: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258f2d190> <<< 42613 1727204573.83476: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 42613 1727204573.83482: stdout chunk (state=3): >>># 
code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258f2e090> <<< 42613 1727204573.83509: stdout chunk (state=3): >>>import 'site' # <<< 42613 1727204573.83536: stdout chunk (state=3): >>>Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 42613 1727204573.83964: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 42613 1727204573.84000: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 42613 1727204573.84021: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 42613 1727204573.84067: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 42613 1727204573.84089: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 42613 1727204573.84134: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 42613 1727204573.84138: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258f6bf50> <<< 42613 1727204573.84177: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 42613 1727204573.84206: stdout chunk (state=3): >>>import '_operator' # 
import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258f800e0> <<< 42613 1727204573.84222: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 42613 1727204573.84287: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 42613 1727204573.84299: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 42613 1727204573.84483: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 42613 1727204573.84506: stdout chunk (state=3): >>>import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258fa3950> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258fa3fe0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258f83c20> import '_functools' # <<< 42613 1727204573.84540: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258f81340> <<< 42613 1727204573.84638: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258f69100> <<< 42613 1727204573.84667: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 42613 1727204573.84740: stdout chunk (state=3): >>># code object 
from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 42613 1727204573.84762: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 42613 1727204573.84843: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 42613 1727204573.84946: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258fc78c0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258fc64e0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258f821e0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258fc4d70> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258ff4950> <<< 42613 1727204573.84970: stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258f68380> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 42613 1727204573.85009: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' 
executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb258ff4e00> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258ff4cb0> <<< 42613 1727204573.85186: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb258ff5070> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258f66ea0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258ff5730> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258ff5400> import 'importlib.machinery' # <<< 42613 1727204573.85224: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 42613 1727204573.85251: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258ff6600> import 'importlib.util' # <<< 42613 1727204573.85270: stdout chunk (state=3): >>>import 'runpy' # <<< 42613 1727204573.85444: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches 
/usr/lib64/python3.12/shutil.py <<< 42613 1727204573.85460: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb259014830> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb259015ee0> <<< 42613 1727204573.85489: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 42613 1727204573.85544: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb259016d80> <<< 42613 1727204573.85590: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb2590173b0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb2590162d0> <<< 42613 1727204573.85606: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from 
'/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 42613 1727204573.85775: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb259017e30> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb259017560> <<< 42613 1727204573.85791: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258ff6660> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 42613 1727204573.85847: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb258d4bd70> <<< 42613 1727204573.85911: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb258d74860> import 'bisect' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fb258d745c0> <<< 42613 1727204573.85977: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb258d74890> <<< 42613 1727204573.85999: stdout chunk (state=3): >>># extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb258d74a70> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258d49f10> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 42613 1727204573.86130: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 42613 1727204573.86163: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 42613 1727204573.86181: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258d760f0> <<< 42613 1727204573.86204: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258d74d70> <<< 42613 1727204573.86422: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258ff6d50> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from 
'/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258d9e480> <<< 42613 1727204573.86458: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 42613 1727204573.86496: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 42613 1727204573.86515: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 42613 1727204573.86575: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258dba630> <<< 42613 1727204573.86611: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 42613 1727204573.86926: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258def3e0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 42613 
1727204573.86957: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 42613 1727204573.87103: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258e19b80> <<< 42613 1727204573.87240: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258def500> <<< 42613 1727204573.87317: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258dbb2c0> <<< 42613 1727204573.87361: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py <<< 42613 1727204573.87380: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' <<< 42613 1727204573.87404: stdout chunk (state=3): >>>import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258bf4500> <<< 42613 1727204573.87423: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258db9670> <<< 42613 1727204573.87426: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258d77020> <<< 42613 1727204573.87789: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 42613 1727204573.87800: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fb258db9400> <<< 42613 1727204573.88069: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_eefrh5_e/ansible_setup_payload.zip' <<< 42613 1727204573.88080: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204573.88368: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204573.88401: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 42613 1727204573.88607: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 42613 1727204573.88617: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 42613 1727204573.88674: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py <<< 42613 1727204573.88679: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' <<< 42613 1727204573.88697: stdout chunk (state=3): >>>import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258c621e0> <<< 42613 1727204573.88714: stdout chunk (state=3): >>>import '_typing' # <<< 42613 1727204573.89108: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258c390d0> <<< 42613 1727204573.89144: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258c38230> # zipimport: zlib available import 'ansible' # <<< 42613 1727204573.89156: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204573.89188: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204573.89239: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204573.89276: stdout chunk (state=3): >>>import 'ansible.module_utils' # # zipimport: zlib available <<< 42613 1727204573.91112: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204573.93328: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from 
'/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258c3b5f0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb258c91be0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258c91970> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258c91280> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258c916d0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258c62c00> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' <<< 42613 1727204573.93352: stdout chunk (state=3): >>># extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7fb258c92960> <<< 42613 1727204573.93373: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb258c92ba0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 42613 1727204573.93503: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # <<< 42613 1727204573.93535: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258c930b0> import 'pwd' # <<< 42613 1727204573.93558: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 42613 1727204573.93763: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258af8e60> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb258afaa80> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 42613 1727204573.93777: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258afb3b0> <<< 42613 1727204573.93814: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 
42613 1727204573.93854: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 42613 1727204573.93881: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258afc590> <<< 42613 1727204573.93904: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 42613 1727204573.94111: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258aff050> <<< 42613 1727204573.94157: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' <<< 42613 1727204573.94191: stdout chunk (state=3): >>># extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb258aff110> <<< 42613 1727204573.94229: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258afd310><<< 42613 1727204573.94247: stdout chunk (state=3): >>> <<< 42613 1727204573.94284: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 42613 1727204573.94323: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 42613 1727204573.94363: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py <<< 42613 
1727204573.94387: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 42613 1727204573.94421: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 42613 1727204573.94484: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc'<<< 42613 1727204573.94516: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py <<< 42613 1727204573.94538: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 42613 1727204573.94586: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258b02ea0> import '_tokenize' # <<< 42613 1727204573.94597: stdout chunk (state=3): >>> <<< 42613 1727204573.94710: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258b01970> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258b016d0><<< 42613 1727204573.94751: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 42613 1727204573.94773: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 42613 1727204573.95091: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258b03e90> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258afd7c0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7fb258b47050> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258b47290> <<< 42613 1727204573.95140: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb258b4cd70> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258b4cb30> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 42613 1727204573.95451: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb258b4f200> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258b4d400> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 42613 1727204573.95455: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258b56990> <<< 42613 1727204573.95598: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258b4f320> <<< 42613 1727204573.95684: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 42613 1727204573.95716: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb258b57c80> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' <<< 42613 1727204573.95759: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb258b579e0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' <<< 42613 1727204573.95940: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb258b57380> import 'systemd.journal' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fb258b47470> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 42613 1727204573.95958: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 42613 1727204573.95980: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb258b5b4a0> <<< 42613 1727204573.96110: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 42613 1727204573.96113: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so'<<< 42613 1727204573.96200: stdout chunk (state=3): >>> import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb258b5c740> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258b59c10> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' <<< 42613 1727204573.96204: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb258b5afc0> import 'systemd.daemon' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fb258b59820> <<< 42613 1727204573.96281: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 42613 1727204573.96313: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204573.96550: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # <<< 42613 1727204573.96554: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204573.96569: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 42613 1727204573.96609: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204573.97007: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204573.97728: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204573.98728: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 42613 1727204573.98761: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 42613 1727204573.98910: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb2589e4890> <<< 42613 1727204573.99000: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches 
/usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb2589e56a0> <<< 42613 1727204573.99056: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258b5fe30> <<< 42613 1727204573.99081: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available <<< 42613 1727204573.99159: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available <<< 42613 1727204573.99445: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204573.99732: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 42613 1727204573.99751: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 42613 1727204573.99779: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb2589e55e0> <<< 42613 1727204573.99914: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204574.00778: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204574.01814: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204574.01818: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204574.01943: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 42613 1727204574.01970: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204574.02030: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204574.02085: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 42613 1727204574.02187: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204574.02288: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 42613 1727204574.02503: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available <<< 42613 1727204574.02545: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 42613 1727204574.02685: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204574.03039: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204574.03496: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 42613 1727204574.03610: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 42613 1727204574.03631: stdout chunk (state=3): >>>import '_ast' # <<< 42613 1727204574.03757: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb2589e64e0> <<< 42613 1727204574.03783: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204574.03913: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204574.04038: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 42613 1727204574.04058: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # <<< 42613 1727204574.04123: stdout chunk (state=3): >>>import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 42613 1727204574.04141: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 42613 1727204574.04376: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 42613 1727204574.04396: stdout chunk (state=3): >>># extension module '_hashlib' loaded from 
'/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 42613 1727204574.04586: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb2589ee090><<< 42613 1727204574.04615: stdout chunk (state=3): >>> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb2589eea20> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258b58ce0> # zipimport: zlib available <<< 42613 1727204574.04649: stdout chunk (state=3): >>># zipimport: zlib available<<< 42613 1727204574.04702: stdout chunk (state=3): >>> <<< 42613 1727204574.04725: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 42613 1727204574.04746: stdout chunk (state=3): >>> <<< 42613 1727204574.04769: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204574.04885: stdout chunk (state=3): >>># zipimport: zlib available<<< 42613 1727204574.04929: stdout chunk (state=3): >>> # zipimport: zlib available<<< 42613 1727204574.04944: stdout chunk (state=3): >>> <<< 42613 1727204574.05039: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204574.05381: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 42613 1727204574.05404: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 42613 1727204574.05421: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from 
'/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so'<<< 42613 1727204574.05512: stdout chunk (state=3): >>> import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb2589ed850> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb2589eebd0> <<< 42613 1727204574.05561: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # <<< 42613 1727204574.05585: stdout chunk (state=3): >>>import 'ansible.module_utils.common.process' # <<< 42613 1727204574.05616: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204574.05737: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204574.05886: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available<<< 42613 1727204574.05906: stdout chunk (state=3): >>> <<< 42613 1727204574.05963: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py<<< 42613 1727204574.06020: stdout chunk (state=3): >>> # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py<<< 42613 1727204574.06039: stdout chunk (state=3): >>> <<< 42613 1727204574.06068: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc'<<< 42613 1727204574.06086: stdout chunk (state=3): >>> <<< 42613 1727204574.06211: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 
42613 1727204574.06249: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 42613 1727204574.06285: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc'<<< 42613 1727204574.06504: stdout chunk (state=3): >>> import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258a82d20> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb2589f8a70><<< 42613 1727204574.06507: stdout chunk (state=3): >>> <<< 42613 1727204574.06632: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb2589f6b70><<< 42613 1727204574.06649: stdout chunk (state=3): >>> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb2589f69c0> # destroy ansible.module_utils.distro<<< 42613 1727204574.06673: stdout chunk (state=3): >>> import 'ansible.module_utils.distro' # # zipimport: zlib available<<< 42613 1727204574.06727: stdout chunk (state=3): >>> # zipimport: zlib available<<< 42613 1727204574.06745: stdout chunk (state=3): >>> <<< 42613 1727204574.06774: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # <<< 42613 1727204574.06799: stdout chunk (state=3): >>> import 'ansible.module_utils.common.sys_info' # <<< 42613 1727204574.06818: stdout chunk (state=3): >>> <<< 42613 1727204574.06921: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # # zipimport: zlib available<<< 42613 1727204574.06942: stdout chunk (state=3): >>> <<< 42613 1727204574.06957: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # <<< 42613 1727204574.06992: stdout chunk (state=3): >>> # zipimport: zlib available <<< 42613 1727204574.07201: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available<<< 42613 1727204574.07226: stdout chunk (state=3): >>> <<< 42613 1727204574.07247: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204574.07281: stdout chunk (state=3): >>># zipimport: zlib available<<< 42613 1727204574.07485: stdout chunk (state=3): >>> # zipimport: zlib available # zipimport: zlib available <<< 42613 1727204574.07511: stdout chunk (state=3): >>># zipimport: zlib available<<< 42613 1727204574.07530: stdout chunk (state=3): >>> <<< 42613 1727204574.07580: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 42613 1727204574.07617: stdout chunk (state=3): >>> # zipimport: zlib available <<< 42613 1727204574.07770: stdout chunk (state=3): >>># zipimport: zlib available<<< 42613 1727204574.07908: stdout chunk (state=3): >>> # zipimport: zlib available<<< 42613 1727204574.07926: stdout chunk (state=3): >>> <<< 42613 1727204574.07951: stdout chunk (state=3): >>># zipimport: zlib available<<< 42613 1727204574.07970: stdout chunk (state=3): >>> <<< 42613 1727204574.08023: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 42613 1727204574.08045: stdout chunk (state=3): >>> <<< 42613 1727204574.08063: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204574.08401: stdout chunk (state=3): >>># zipimport: zlib available<<< 42613 1727204574.08481: stdout chunk (state=3): >>> <<< 42613 1727204574.08735: stdout chunk (state=3): >>># zipimport: zlib available<<< 42613 1727204574.08780: stdout chunk (state=3): >>> <<< 42613 1727204574.08887: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204574.08911: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py <<< 42613 1727204574.08929: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 42613 1727204574.08997: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc'<<< 42613 1727204574.09008: stdout chunk (state=3): >>> <<< 42613 1727204574.09026: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py<<< 42613 1727204574.09081: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 42613 1727204574.09129: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258a85be0> <<< 42613 1727204574.09310: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 42613 1727204574.09314: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 42613 1727204574.09344: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py<<< 42613 1727204574.09370: stdout chunk (state=3): >>> <<< 42613 1727204574.09388: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' <<< 42613 1727204574.09414: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb257f4c3b0><<< 42613 1727204574.09434: stdout chunk (state=3): >>> <<< 42613 1727204574.09474: stdout chunk (state=3): >>># extension module '_pickle' loaded from 
'/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so'<<< 42613 1727204574.09507: stdout chunk (state=3): >>> # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 42613 1727204574.09607: stdout chunk (state=3): >>>import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb257f4c710> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258a65400> <<< 42613 1727204574.09645: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258a64500><<< 42613 1727204574.09658: stdout chunk (state=3): >>> <<< 42613 1727204574.09710: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258a84380> <<< 42613 1727204574.09908: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258a84c80><<< 42613 1727204574.09915: stdout chunk (state=3): >>> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py <<< 42613 1727204574.09938: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 42613 1727204574.09975: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py<<< 42613 1727204574.09999: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc'<<< 42613 1727204574.10020: stdout chunk (state=3): >>> <<< 42613 1727204574.10055: stdout chunk (state=3): >>># extension module '_heapq' loaded from 
'/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb257f4f6b0><<< 42613 1727204574.10105: stdout chunk (state=3): >>> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb257f4ef60> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so'<<< 42613 1727204574.10162: stdout chunk (state=3): >>> # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb257f4f140> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb257f4e390> <<< 42613 1727204574.10286: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py<<< 42613 1727204574.10424: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 42613 1727204574.10443: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb257f4f770> <<< 42613 1727204574.10516: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 42613 1727204574.10540: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 42613 1727204574.10633: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from 
'/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb257fb62a0> <<< 42613 1727204574.10687: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb257fb42c0> <<< 42613 1727204574.10745: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258a87e00><<< 42613 1727204574.10768: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.timeout' # <<< 42613 1727204574.10882: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # <<< 42613 1727204574.10933: stdout chunk (state=3): >>> # zipimport: zlib available <<< 42613 1727204574.11030: stdout chunk (state=3): >>># zipimport: zlib available<<< 42613 1727204574.11120: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.other.facter' # <<< 42613 1727204574.11149: stdout chunk (state=3): >>> <<< 42613 1727204574.11169: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204574.11283: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204574.11366: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 42613 1727204574.11379: stdout chunk (state=3): >>> <<< 42613 1727204574.11473: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available <<< 42613 1727204574.11600: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available <<< 42613 1727204574.11654: stdout chunk (state=3): >>># zipimport: zlib available<<< 42613 1727204574.11753: stdout chunk (state=3): >>> import 
'ansible.module_utils.facts.system.caps' # <<< 42613 1727204574.11799: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204574.11843: stdout chunk (state=3): >>># zipimport: zlib available<<< 42613 1727204574.11849: stdout chunk (state=3): >>> <<< 42613 1727204574.11947: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 42613 1727204574.11951: stdout chunk (state=3): >>># zipimport: zlib available<<< 42613 1727204574.12259: stdout chunk (state=3): >>> # zipimport: zlib available # zipimport: zlib available <<< 42613 1727204574.12262: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204574.12395: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # <<< 42613 1727204574.12417: stdout chunk (state=3): >>># zipimport: zlib available<<< 42613 1727204574.12423: stdout chunk (state=3): >>> <<< 42613 1727204574.13466: stdout chunk (state=3): >>># zipimport: zlib available<<< 42613 1727204574.13884: stdout chunk (state=3): >>> <<< 42613 1727204574.14500: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 42613 1727204574.14540: stdout chunk (state=3): >>># zipimport: zlib available<<< 42613 1727204574.14548: stdout chunk (state=3): >>> <<< 42613 1727204574.14591: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # <<< 42613 1727204574.14611: stdout chunk (state=3): >>> <<< 42613 1727204574.14628: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.date_time' # <<< 42613 1727204574.14647: stdout chunk (state=3): >>># zipimport: zlib available<<< 42613 1727204574.14660: stdout chunk (state=3): >>> <<< 42613 1727204574.14707: stdout chunk (state=3): >>># zipimport: zlib available<<< 42613 1727204574.14759: stdout chunk (state=3): >>> import 
'ansible.module_utils.facts.system.env' # <<< 42613 1727204574.14785: stdout chunk (state=3): >>># zipimport: zlib available<<< 42613 1727204574.14795: stdout chunk (state=3): >>> <<< 42613 1727204574.15194: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available<<< 42613 1727204574.15228: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.system.loadavg' # <<< 42613 1727204574.15288: stdout chunk (state=3): >>> <<< 42613 1727204574.15291: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204574.15546: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py<<< 42613 1727204574.15556: stdout chunk (state=3): >>> <<< 42613 1727204574.15574: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 42613 1727204574.15620: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb257fb77a0><<< 42613 1727204574.15639: stdout chunk (state=3): >>> <<< 42613 1727204574.15664: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py<<< 42613 1727204574.15674: stdout chunk (state=3): >>> <<< 42613 1727204574.15904: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 42613 1727204574.16290: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb257fb6ea0><<< 42613 1727204574.16294: stdout chunk (state=3): >>> <<< 42613 1727204574.16296: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.local' # <<< 42613 1727204574.16299: stdout chunk 
(state=3): >>># zipimport: zlib available<<< 42613 1727204574.16301: stdout chunk (state=3): >>> <<< 42613 1727204574.16303: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available <<< 42613 1727204574.16592: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 42613 1727204574.16623: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204574.16743: stdout chunk (state=3): >>># zipimport: zlib available<<< 42613 1727204574.16755: stdout chunk (state=3): >>> <<< 42613 1727204574.16893: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available<<< 42613 1727204574.16905: stdout chunk (state=3): >>> <<< 42613 1727204574.17005: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204574.17122: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py<<< 42613 1727204574.17127: stdout chunk (state=3): >>> <<< 42613 1727204574.17286: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 42613 1727204574.17384: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so'<<< 42613 1727204574.17399: stdout chunk (state=3): >>> import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb257fea420> <<< 42613 1727204574.17834: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb257fd3110> <<< 42613 1727204574.17838: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.python' # <<< 42613 1727204574.17841: stdout chunk (state=3): >>># zipimport: zlib available<<< 42613 
1727204574.18048: stdout chunk (state=3): >>> # zipimport: zlib available <<< 42613 1727204574.18053: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 42613 1727204574.18056: stdout chunk (state=3): >>># zipimport: zlib available<<< 42613 1727204574.18376: stdout chunk (state=3): >>> # zipimport: zlib available<<< 42613 1727204574.18380: stdout chunk (state=3): >>> <<< 42613 1727204574.18382: stdout chunk (state=3): >>># zipimport: zlib available<<< 42613 1727204574.18385: stdout chunk (state=3): >>> <<< 42613 1727204574.18554: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204574.18819: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # <<< 42613 1727204574.18837: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.service_mgr' # <<< 42613 1727204574.18851: stdout chunk (state=3): >>> <<< 42613 1727204574.18872: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204574.19189: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available <<< 42613 1727204574.19202: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py<<< 42613 1727204574.19220: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc'<<< 42613 1727204574.19275: stdout chunk (state=3): >>> # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so'<<< 42613 1727204574.19284: stdout chunk (state=3): >>> <<< 42613 1727204574.19474: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so'<<< 42613 1727204574.19478: stdout chunk (state=3): >>> import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7fb257e05d60> <<< 42613 1727204574.19481: stdout chunk (state=3): >>>import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb257e05a30> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available <<< 42613 1727204574.19484: stdout chunk (state=3): >>># zipimport: zlib available<<< 42613 1727204574.19486: stdout chunk (state=3): >>> <<< 42613 1727204574.19541: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # <<< 42613 1727204574.19558: stdout chunk (state=3): >>> <<< 42613 1727204574.19577: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204574.20229: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204574.20398: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available <<< 42613 1727204574.20581: stdout chunk (state=3): >>># zipimport: zlib available<<< 42613 1727204574.20585: stdout chunk (state=3): >>> <<< 42613 1727204574.20629: stdout chunk (state=3): >>># zipimport: zlib available<<< 42613 1727204574.20643: stdout chunk (state=3): >>> <<< 42613 1727204574.20698: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # <<< 42613 1727204574.20715: stdout chunk (state=3): >>> <<< 42613 1727204574.20731: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.darwin' # <<< 42613 1727204574.20752: stdout chunk (state=3): >>># zipimport: zlib available<<< 42613 1727204574.20770: stdout chunk (state=3): >>> <<< 42613 1727204574.20796: stdout chunk (state=3): >>># zipimport: zlib available<<< 42613 1727204574.20991: stdout chunk (state=3): >>> # zipimport: zlib available <<< 42613 1727204574.21120: stdout chunk (state=3): >>># zipimport: zlib available<<< 42613 1727204574.21482: stdout chunk (state=3): >>> <<< 42613 1727204574.21502: stdout chunk 
(state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available <<< 42613 1727204574.21674: stdout chunk (state=3): >>># zipimport: zlib available<<< 42613 1727204574.21753: stdout chunk (state=3): >>> <<< 42613 1727204574.21989: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available <<< 42613 1727204574.22038: stdout chunk (state=3): >>># zipimport: zlib available<<< 42613 1727204574.22053: stdout chunk (state=3): >>> <<< 42613 1727204574.23246: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204574.24279: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # <<< 42613 1727204574.24283: stdout chunk (state=3): >>> # zipimport: zlib available <<< 42613 1727204574.24411: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204574.24493: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 42613 1727204574.24511: stdout chunk (state=3): >>> <<< 42613 1727204574.24524: stdout chunk (state=3): >>># zipimport: zlib available<<< 42613 1727204574.24536: stdout chunk (state=3): >>> <<< 42613 1727204574.24711: stdout chunk (state=3): >>># zipimport: zlib available<<< 42613 1727204574.24722: stdout chunk (state=3): >>> <<< 42613 1727204574.24892: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 42613 1727204574.24925: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204574.25219: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204574.25684: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available<<< 42613 1727204574.25688: stdout chunk (state=3): >>> <<< 42613 1727204574.25692: stdout chunk (state=3): >>># zipimport: zlib available<<< 
42613 1727204574.25694: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.network' # # zipimport: zlib available <<< 42613 1727204574.25697: stdout chunk (state=3): >>># zipimport: zlib available<<< 42613 1727204574.25737: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.network.base' # <<< 42613 1727204574.25791: stdout chunk (state=3): >>> # zipimport: zlib available <<< 42613 1727204574.26029: stdout chunk (state=3): >>># zipimport: zlib available<<< 42613 1727204574.26144: stdout chunk (state=3): >>> <<< 42613 1727204574.26149: stdout chunk (state=3): >>># zipimport: zlib available<<< 42613 1727204574.26152: stdout chunk (state=3): >>> <<< 42613 1727204574.26605: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204574.26924: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # <<< 42613 1727204574.26944: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.network.aix' # <<< 42613 1727204574.26980: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204574.27050: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204574.27194: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available<<< 42613 1727204574.27213: stdout chunk (state=3): >>> # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # <<< 42613 1727204574.27239: stdout chunk (state=3): >>> # zipimport: zlib available<<< 42613 1727204574.27243: stdout chunk (state=3): >>> <<< 42613 1727204574.27406: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204574.27475: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 42613 1727204574.27626: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204574.27643: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available <<< 
42613 1727204574.27796: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # <<< 42613 1727204574.27916: stdout chunk (state=3): >>> <<< 42613 1727204574.27946: stdout chunk (state=3): >>># zipimport: zlib available<<< 42613 1727204574.27961: stdout chunk (state=3): >>> <<< 42613 1727204574.28041: stdout chunk (state=3): >>># zipimport: zlib available<<< 42613 1727204574.28619: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available <<< 42613 1727204574.29074: stdout chunk (state=3): >>># zipimport: zlib available<<< 42613 1727204574.29207: stdout chunk (state=3): >>> <<< 42613 1727204574.29577: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 42613 1727204574.29604: stdout chunk (state=3): >>># zipimport: zlib available<<< 42613 1727204574.29618: stdout chunk (state=3): >>> <<< 42613 1727204574.29718: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204574.29823: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 42613 1727204574.29850: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204574.29973: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204574.30051: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 42613 1727204574.30054: stdout chunk (state=3): >>># zipimport: zlib available<<< 42613 1727204574.30057: stdout chunk (state=3): >>> <<< 42613 1727204574.30290: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available <<< 42613 1727204574.30471: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204574.30555: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # # 
zipimport: zlib available<<< 42613 1727204574.30575: stdout chunk (state=3): >>> <<< 42613 1727204574.30587: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual' # <<< 42613 1727204574.30692: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 42613 1727204574.31138: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available <<< 42613 1727204574.31142: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204574.31144: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204574.31146: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204574.31148: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204574.31150: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204574.31153: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # <<< 42613 1727204574.31155: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available <<< 42613 1727204574.31157: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204574.31202: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 42613 1727204574.31514: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204574.31731: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available <<< 42613 1727204574.31758: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 42613 1727204574.31775: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204574.31821: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204574.31872: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.virtual.openbsd' # <<< 42613 1727204574.31880: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204574.31985: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204574.32067: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 42613 1727204574.32074: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204574.32263: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204574.32283: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 42613 1727204574.32377: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204574.32562: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py <<< 42613 1727204574.32589: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 42613 1727204574.32651: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 42613 1727204574.32655: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb257e2e8d0> <<< 42613 1727204574.32658: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb257e2c9e0> <<< 42613 1727204574.32721: stdout chunk (state=3): >>>import 
'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb257e2bf80> <<< 42613 1727204574.34339: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2f45753f4562ec5cb5af974e0ba4b7", "ansible_fips": false, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "02", "second": "54", "epoch": "1727204574", "epoch_int": "1727204574", "date": "2024-09-24", "time": "15:02:54", "iso8601_micro": "2024-09-24T19:02:54.333283Z", "iso8601": "2024-09-24T19:02:54Z", "iso8601_basic": "20240924T150254333283", "iso8601_basic_short": "20240924T150254", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, 
"ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_pkg_mgr": "dnf", "ansible_lsb": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 39754 10.31.45.169 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 39754 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_local": {}, "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQDMno02CCUpdfdqhSLw3DEMgRB2qjjltGvjvvCA4FaqYC6LCLimL/modyFwZBTr6mod5aVbEv6eVT3cFEFELkt0kQvtOziptiFW5YZ0dlqvF004nHv7tpUqUboaKXf3hY9kfDIHOuUKZOV1AH7UTuNGixxTuYXFQ+fG7hLGh4Vep864Qk6wN5hv56JDtXEzMMB7xxbnEU6nTFIA8TIX+aYYYxIipVJjI+TR9J9VhQf+oNDJhhqqHyobnqG5WTt3jEYQo+8cWC4B8LegOCwae4jpCrLwhKnzmvV787NTqy90vgHgain4GhWTCKI+2d<<< 42613 1727204574.34494: stdout chunk (state=3): >>>FsqryKBgKIBuENXOpmpRGT4gqBQHbc5v/vxWqYoPPhg1Wb8R+WRueYbdol4I10CveFNShlWCSRLSu/vOutS0xtU3WEIQFs2Mn06Aqq6bMoG70EJ9bJOEQ82f23VIkVKoW1hmcKHTCrMv715oNONo08LOTQkBYDv3MQpAtFQnpuIVPlAXYu1spAx3i2i31866ukCUE=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBG5WDcSyqLv17rg+6P3+2pjKf2x2X+Jf8yHGACagVeIm/l8LWG2NszXTHOdaZlbD4aes7hBRe0B7oCa8ilqHGf0=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAII5o3yNikV31ncy7je2EsNwog36vbYT7D9w98r4ZeD7x", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 42613 1727204574.35547: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix 
# cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 
# cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible <<< 42613 1727204574.35552: stdout chunk (state=3): >>># destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token <<< 42613 1727204574.35640: stdout chunk (state=3): >>># destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing 
textwrap <<< 42613 1727204574.35895: stdout chunk (state=3): >>># cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd<<< 42613 1727204574.35950: stdout chunk (state=3): >>> # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing 
ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # 
cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiproc<<< 42613 1727204574.35994: stdout chunk (state=3): >>>essing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # 
cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing 
ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy 
ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy 
ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 42613 1727204574.36562: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 42613 1727204574.36585: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma <<< 42613 1727204574.36673: stdout chunk (state=3): >>># destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid <<< 42613 1727204574.36677: stdout chunk (state=3): >>># destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil <<< 42613 1727204574.36718: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy 
ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array <<< 42613 1727204574.36740: stdout chunk (state=3): >>># destroy _compat_pickle # destroy _pickle <<< 42613 1727204574.36770: stdout chunk (state=3): >>># destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process <<< 42613 1727204574.36787: stdout chunk (state=3): >>># destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing <<< 42613 1727204574.36973: stdout chunk (state=3): >>># destroy shlex # destroy fcntl <<< 42613 1727204574.36977: stdout chunk (state=3): >>># destroy datetime # destroy subprocess # destroy base64 <<< 42613 1727204574.36979: stdout chunk (state=3): >>># destroy _ssl <<< 42613 1727204574.36984: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios <<< 42613 1727204574.36987: stdout chunk (state=3): >>># destroy errno # destroy json <<< 42613 1727204574.36993: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob <<< 42613 1727204574.36996: stdout chunk (state=3): >>># destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 42613 1727204574.37006: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep <<< 42613 1727204574.37056: stdout chunk (state=3): >>># cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] 
wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform<<< 42613 1727204574.37194: stdout chunk (state=3): >>> # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types <<< 42613 1727204574.37202: stdout chunk (state=3): >>># cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io <<< 42613 1727204574.37209: stdout chunk (state=3): >>># destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # 
cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys <<< 42613 1727204574.37413: stdout chunk (state=3): >>># cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 42613 1727204574.37716: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs <<< 42613 1727204574.37727: stdout chunk (state=3): >>># destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 42613 1727204574.37754: stdout chunk (state=3): >>># destroy _random # destroy _weakref <<< 42613 1727204574.37788: stdout chunk (state=3): >>># destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re <<< 42613 
1727204574.37829: stdout chunk (state=3): >>># destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread<<< 42613 1727204574.37838: stdout chunk (state=3): >>> # clear sys.audit hooks <<< 42613 1727204574.38722: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. <<< 42613 1727204574.38726: stderr chunk (state=3): >>><<< 42613 1727204574.38728: stdout chunk (state=3): >>><<< 42613 1727204574.38987: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb259118530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb2590e7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb25911aab0> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import 
'_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258f2d190> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258f2e090> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258f6bf50> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258f800e0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258fa3950> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fb258fa3fe0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258f83c20> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258f81340> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258f69100> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258fc78c0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258fc64e0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258f821e0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258fc4d70> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258ff4950> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258f68380> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches 
/usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb258ff4e00> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258ff4cb0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb258ff5070> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258f66ea0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258ff5730> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258ff5400> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258ff6600> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches 
/usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb259014830> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb259015ee0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb259016d80> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb2590173b0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb2590162d0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fb259017e30> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb259017560> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258ff6660> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb258d4bd70> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb258d74860> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258d745c0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb258d74890> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed 
from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb258d74a70> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258d49f10> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258d760f0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258d74d70> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258ff6d50> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258d9e480> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258dba630> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py 
# code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258def3e0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258e19b80> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258def500> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258dbb2c0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258bf4500> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258db9670> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258d77020> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fb258db9400> # zipimport: found 103 names in '/tmp/ansible_setup_payload_eefrh5_e/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from 
'/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258c621e0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258c390d0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258c38230> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258c3b5f0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from 
'/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb258c91be0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258c91970> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258c91280> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258c916d0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258c62c00> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb258c92960> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb258c92ba0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258c930b0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258af8e60> # extension module 'select' loaded from 
'/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb258afaa80> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258afb3b0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258afc590> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258aff050> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb258aff110> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258afd310> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc 
matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258b02ea0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258b01970> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258b016d0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258b03e90> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258afd7c0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb258b47050> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258b47290> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from 
'/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb258b4cd70> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258b4cb30> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb258b4f200> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258b4d400> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258b56990> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258b4f320> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # 
extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb258b57c80> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb258b579e0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb258b57380> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258b47470> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb258b5b4a0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from 
'/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb258b5c740> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258b59c10> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb258b5afc0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258b59820> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb2589e4890> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches 
/usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb2589e56a0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258b5fe30> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb2589e55e0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb2589e64e0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches 
/usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb2589ee090> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb2589eea20> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258b58ce0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb2589ed850> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb2589eebd0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available 
# /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258a82d20> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb2589f8a70> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb2589f6b70> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb2589f69c0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib 
available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258a85be0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb257f4c3b0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb257f4c710> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fb258a65400> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258a64500> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258a84380> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258a84c80> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb257f4f6b0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb257f4ef60> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb257f4f140> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb257f4e390> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fb257f4f770> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb257fb62a0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb257fb42c0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb258a87e00> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb257fb77a0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb257fb6ea0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from 
'/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb257fea420> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb257fd3110> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb257e05d60> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb257e05a30> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib 
available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 
'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb257e2e8d0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb257e2c9e0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb257e2bf80> {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2f45753f4562ec5cb5af974e0ba4b7", "ansible_fips": false, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, 
"ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "02", "second": "54", "epoch": "1727204574", "epoch_int": "1727204574", "date": "2024-09-24", "time": "15:02:54", "iso8601_micro": "2024-09-24T19:02:54.333283Z", "iso8601": "2024-09-24T19:02:54Z", "iso8601_basic": "20240924T150254333283", "iso8601_basic_short": "20240924T150254", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_pkg_mgr": "dnf", "ansible_lsb": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", 
"LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 39754 10.31.45.169 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 39754 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_local": {}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDMno02CCUpdfdqhSLw3DEMgRB2qjjltGvjvvCA4FaqYC6LCLimL/modyFwZBTr6mod5aVbEv6eVT3cFEFELkt0kQvtOziptiFW5YZ0dlqvF004nHv7tpUqUboaKXf3hY9kfDIHOuUKZOV1AH7UTuNGixxTuYXFQ+fG7hLGh4Vep864Qk6wN5hv56JDtXEzMMB7xxbnEU6nTFIA8TIX+aYYYxIipVJjI+TR9J9VhQf+oNDJhhqqHyobnqG5WTt3jEYQo+8cWC4B8LegOCwae4jpCrLwhKnzmvV787NTqy90vgHgain4GhWTCKI+2dFsqryKBgKIBuENXOpmpRGT4gqBQHbc5v/vxWqYoPPhg1Wb8R+WRueYbdol4I10CveFNShlWCSRLSu/vOutS0xtU3WEIQFs2Mn06Aqq6bMoG70EJ9bJOEQ82f23VIkVKoW1hmcKHTCrMv715oNONo08LOTQkBYDv3MQpAtFQnpuIVPlAXYu1spAx3i2i31866ukCUE=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBG5WDcSyqLv17rg+6P3+2pjKf2x2X+Jf8yHGACagVeIm/l8LWG2NszXTHOdaZlbD4aes7hBRe0B7oCa8ilqHGf0=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAII5o3yNikV31ncy7je2EsNwog36vbYT7D9w98r4ZeD7x", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": 
"/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing 
re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # 
cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux 
# cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # 
cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing 
ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing 
ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy 
ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy 
ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # 
destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib 
# cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy 
ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. [WARNING]: Module invocation had junk after the JSON data:
42613 1727204574.41884: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, 
'_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204573.657051-42761-267739743121753/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 42613 1727204574.41898: _low_level_execute_command(): starting 42613 1727204574.41901: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204573.657051-42761-267739743121753/ > /dev/null 2>&1 && sleep 0' 42613 1727204574.41904: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204574.41907: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204574.41909: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 42613 1727204574.41912: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204574.42129: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204574.42222: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204574.42343: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 42613 1727204574.44898: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204574.44918: stderr chunk (state=3): >>><<< 42613 1727204574.44988: stdout chunk (state=3): >>><<< 42613 1727204574.45006: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204574.45013: handler run complete 42613 1727204574.45269: variable 'ansible_facts' from source: unknown 42613 1727204574.45274: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204574.45466: variable 'ansible_facts' from source: unknown 42613 1727204574.45898: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204574.45969: attempt loop 
complete, returning result 42613 1727204574.45973: _execute() done 42613 1727204574.45976: dumping result to json 42613 1727204574.45990: done dumping result, returning 42613 1727204574.46000: done running TaskExecutor() for managed-node3/TASK: Gather the minimum subset of ansible_facts required by the network role test [127b8e07-fff9-2f91-05d8-0000000000c0] 42613 1727204574.46007: sending task result for task 127b8e07-fff9-2f91-05d8-0000000000c0 42613 1727204574.46244: done sending task result for task 127b8e07-fff9-2f91-05d8-0000000000c0 42613 1727204574.46247: WORKER PROCESS EXITING ok: [managed-node3] 42613 1727204574.46496: no more pending results, returning what we have 42613 1727204574.46500: results queue empty 42613 1727204574.46501: checking for any_errors_fatal 42613 1727204574.46503: done checking for any_errors_fatal 42613 1727204574.46504: checking for max_fail_percentage 42613 1727204574.46506: done checking for max_fail_percentage 42613 1727204574.46507: checking to see if all hosts have failed and the running result is not ok 42613 1727204574.46507: done checking to see if all hosts have failed 42613 1727204574.46508: getting the remaining hosts for this loop 42613 1727204574.46509: done getting the remaining hosts for this loop 42613 1727204574.46514: getting the next task for host managed-node3 42613 1727204574.46522: done getting next task for host managed-node3 42613 1727204574.46525: ^ task is: TASK: Check if system is ostree 42613 1727204574.46529: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 42613 1727204574.46533: getting variables 42613 1727204574.46535: in VariableManager get_vars() 42613 1727204574.46563: Calling all_inventory to load vars for managed-node3 42613 1727204574.46568: Calling groups_inventory to load vars for managed-node3 42613 1727204574.46572: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204574.46584: Calling all_plugins_play to load vars for managed-node3 42613 1727204574.46587: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204574.46591: Calling groups_plugins_play to load vars for managed-node3 42613 1727204574.46880: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204574.47113: done with get_vars() 42613 1727204574.47126: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Tuesday 24 September 2024 15:02:54 -0400 (0:00:00.916) 0:00:03.080 ***** 42613 1727204574.47229: entering _queue_task() for managed-node3/stat 42613 1727204574.47653: worker is 1 (out of 1 available) 42613 1727204574.47679: exiting _queue_task() for managed-node3/stat 42613 1727204574.47691: done queuing things up, now waiting for results queue to drain 42613 1727204574.47693: waiting for pending results... 
42613 1727204574.48162: running TaskExecutor() for managed-node3/TASK: Check if system is ostree 42613 1727204574.48174: in run() - task 127b8e07-fff9-2f91-05d8-0000000000c2 42613 1727204574.48177: variable 'ansible_search_path' from source: unknown 42613 1727204574.48180: variable 'ansible_search_path' from source: unknown 42613 1727204574.48655: calling self._execute() 42613 1727204574.48697: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204574.48702: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204574.48715: variable 'omit' from source: magic vars 42613 1727204574.49587: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 42613 1727204574.49889: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 42613 1727204574.49991: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 42613 1727204574.50039: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 42613 1727204574.50081: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 42613 1727204574.50289: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 42613 1727204574.50293: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 42613 1727204574.50309: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204574.50340: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 42613 1727204574.50487: Evaluated conditional (not __network_is_ostree is defined): True 42613 1727204574.50499: variable 'omit' from source: magic vars 42613 1727204574.50544: variable 'omit' from source: magic vars 42613 1727204574.50594: variable 'omit' from source: magic vars 42613 1727204574.50890: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 42613 1727204574.50894: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 42613 1727204574.50897: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 42613 1727204574.51078: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204574.51082: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204574.51084: variable 'inventory_hostname' from source: host vars for 'managed-node3' 42613 1727204574.51086: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204574.51089: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204574.51412: Set connection var ansible_shell_executable to /bin/sh 42613 1727204574.51426: Set connection var ansible_pipelining to False 42613 1727204574.51468: Set connection var ansible_module_compression to ZIP_DEFLATED 42613 1727204574.51477: Set connection var ansible_connection to ssh 42613 1727204574.51673: Set connection var ansible_timeout to 10 42613 1727204574.51676: Set connection var ansible_shell_type to sh 42613 1727204574.51679: variable 'ansible_shell_executable' from source: unknown 42613 1727204574.51681: variable 'ansible_connection' from 
source: unknown 42613 1727204574.51684: variable 'ansible_module_compression' from source: unknown 42613 1727204574.51686: variable 'ansible_shell_type' from source: unknown 42613 1727204574.51688: variable 'ansible_shell_executable' from source: unknown 42613 1727204574.51690: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204574.51692: variable 'ansible_pipelining' from source: unknown 42613 1727204574.51694: variable 'ansible_timeout' from source: unknown 42613 1727204574.51696: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204574.52048: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 42613 1727204574.52070: variable 'omit' from source: magic vars 42613 1727204574.52220: starting attempt loop 42613 1727204574.52223: running the handler 42613 1727204574.52226: _low_level_execute_command(): starting 42613 1727204574.52229: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 42613 1727204574.53785: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204574.53882: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204574.55687: stdout chunk (state=3): >>>/root <<< 42613 1727204574.55850: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204574.55872: stdout chunk (state=3): >>><<< 42613 1727204574.55895: stderr chunk (state=3): >>><<< 42613 1727204574.55929: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204574.55957: 
_low_level_execute_command(): starting 42613 1727204574.55971: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204574.5594425-42805-217186559099536 `" && echo ansible-tmp-1727204574.5594425-42805-217186559099536="` echo /root/.ansible/tmp/ansible-tmp-1727204574.5594425-42805-217186559099536 `" ) && sleep 0' 42613 1727204574.56654: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204574.56674: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204574.56691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204574.56717: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 42613 1727204574.56813: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204574.56842: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204574.56952: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204574.59110: stdout chunk (state=3): 
>>>ansible-tmp-1727204574.5594425-42805-217186559099536=/root/.ansible/tmp/ansible-tmp-1727204574.5594425-42805-217186559099536 <<< 42613 1727204574.59340: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204574.59344: stdout chunk (state=3): >>><<< 42613 1727204574.59346: stderr chunk (state=3): >>><<< 42613 1727204574.59368: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204574.5594425-42805-217186559099536=/root/.ansible/tmp/ansible-tmp-1727204574.5594425-42805-217186559099536 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204574.59447: variable 'ansible_module_compression' from source: unknown 42613 1727204574.59555: ANSIBALLZ: Using lock for stat 42613 1727204574.59559: ANSIBALLZ: Acquiring lock 42613 1727204574.59561: ANSIBALLZ: Lock acquired: 139982757272880 42613 1727204574.59563: 
ANSIBALLZ: Creating module 42613 1727204574.77759: ANSIBALLZ: Writing module into payload 42613 1727204574.77763: ANSIBALLZ: Writing module 42613 1727204574.77767: ANSIBALLZ: Renaming module 42613 1727204574.77770: ANSIBALLZ: Done creating module 42613 1727204574.77772: variable 'ansible_facts' from source: unknown 42613 1727204574.77908: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204574.5594425-42805-217186559099536/AnsiballZ_stat.py 42613 1727204574.78298: Sending initial data 42613 1727204574.78302: Sent initial data (153 bytes) 42613 1727204574.79562: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204574.79668: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204574.79675: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204574.79679: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204574.79707: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204574.79749: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204574.79820: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 42613 1727204574.82555: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 42613 1727204574.82560: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 42613 1727204574.82688: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-42613igbktgy6/tmpgfle9s82 /root/.ansible/tmp/ansible-tmp-1727204574.5594425-42805-217186559099536/AnsiballZ_stat.py <<< 42613 1727204574.82692: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204574.5594425-42805-217186559099536/AnsiballZ_stat.py" <<< 42613 1727204574.82719: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-42613igbktgy6/tmpgfle9s82" to remote "/root/.ansible/tmp/ansible-tmp-1727204574.5594425-42805-217186559099536/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204574.5594425-42805-217186559099536/AnsiballZ_stat.py" <<< 42613 1727204574.84245: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204574.84337: stderr chunk (state=3): >>><<< 42613 1727204574.84347: stdout chunk (state=3): 
>>><<< 42613 1727204574.84372: done transferring module to remote 42613 1727204574.84392: _low_level_execute_command(): starting 42613 1727204574.84400: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204574.5594425-42805-217186559099536/ /root/.ansible/tmp/ansible-tmp-1727204574.5594425-42805-217186559099536/AnsiballZ_stat.py && sleep 0' 42613 1727204574.85122: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204574.85171: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204574.85174: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204574.85177: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 42613 1727204574.85180: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 42613 1727204574.85183: stderr chunk (state=3): >>>debug2: match not found <<< 42613 1727204574.85185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204574.85198: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 42613 1727204574.85205: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.169 is address <<< 42613 1727204574.85216: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 42613 1727204574.85219: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204574.85327: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204574.85334: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 42613 1727204574.85338: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 42613 1727204574.85340: 
stderr chunk (state=3): >>>debug2: match found <<< 42613 1727204574.85342: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204574.85345: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204574.85361: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204574.85369: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204574.85487: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 42613 1727204574.88312: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204574.88356: stderr chunk (state=3): >>><<< 42613 1727204574.88371: stdout chunk (state=3): >>><<< 42613 1727204574.88401: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 42613 1727204574.88423: _low_level_execute_command(): starting 42613 1727204574.88511: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204574.5594425-42805-217186559099536/AnsiballZ_stat.py && sleep 0' 42613 1727204574.89278: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204574.89312: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204574.89349: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204574.89429: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 42613 1727204574.93222: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin <<< 42613 1727204574.93246: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 42613 1727204574.93333: stdout chunk (state=3): 
>>>import '_io' # import 'marshal' # <<< 42613 1727204574.93428: stdout chunk (state=3): >>>import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook <<< 42613 1727204574.93456: stdout chunk (state=3): >>>import 'time' # <<< 42613 1727204574.93477: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 42613 1727204574.93574: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 42613 1727204574.93596: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # <<< 42613 1727204574.93613: stdout chunk (state=3): >>>import 'codecs' # <<< 42613 1727204574.93658: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 42613 1727204574.93693: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c09fc530> <<< 42613 1727204574.93750: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c09cbb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c09feab0> <<< 42613 1727204574.93781: stdout chunk (state=3): >>>import '_signal' # <<< 42613 1727204574.93908: stdout chunk (state=3): >>>import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # <<< 42613 1727204574.94027: stdout chunk (state=3): >>>import '_collections_abc' # <<< 42613 1727204574.94074: stdout chunk (state=3): >>>import 'genericpath' 
# import 'posixpath' # <<< 42613 1727204574.94128: stdout chunk (state=3): >>>import 'os' # <<< 42613 1727204574.94131: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 42613 1727204574.94186: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' <<< 42613 1727204574.94190: stdout chunk (state=3): >>>Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' <<< 42613 1727204574.94221: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 42613 1727204574.94224: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py <<< 42613 1727204574.94261: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c08111c0> <<< 42613 1727204574.94360: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 42613 1727204574.94363: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 42613 1727204574.94413: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c08120c0> import 'site' # <<< 42613 1727204574.94452: stdout chunk (state=3): >>>Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 42613 1727204574.94882: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 42613 1727204574.94913: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 42613 1727204574.94918: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 42613 1727204574.95010: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 42613 1727204574.95023: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 42613 1727204574.95057: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 42613 1727204574.95090: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c084ffb0> <<< 42613 1727204574.95094: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 42613 1727204574.95121: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 42613 1727204574.95161: stdout chunk (state=3): >>>import '_operator' # <<< 42613 1727204574.95164: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c0864140> <<< 42613 1727204574.95218: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 42613 1727204574.95222: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' 
<<< 42613 1727204574.95250: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 42613 1727204574.95323: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 42613 1727204574.95381: stdout chunk (state=3): >>>import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c0887950> <<< 42613 1727204574.95414: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c0887fe0> <<< 42613 1727204574.95512: stdout chunk (state=3): >>>import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c0867c20> <<< 42613 1727204574.95523: stdout chunk (state=3): >>>import '_functools' # <<< 42613 1727204574.95553: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c08653a0> <<< 42613 1727204574.95704: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c084d160> <<< 42613 1727204574.95741: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 42613 1727204574.95761: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 42613 1727204574.95782: stdout chunk (state=3): >>>import '_sre' # <<< 42613 1727204574.95802: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 42613 1727204574.95871: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py <<< 42613 1727204574.95874: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 42613 1727204574.95950: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c08ab8f0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c08aa510> <<< 42613 1727204574.95970: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' <<< 42613 1727204574.96009: stdout chunk (state=3): >>>import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c0866240> <<< 42613 1727204574.96022: stdout chunk (state=3): >>>import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c08a8d70> <<< 42613 1727204574.96097: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 42613 1727204574.96101: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c08d8980> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c084c3e0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 42613 1727204574.96142: stdout chunk (state=3): >>># extension 
module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 42613 1727204574.96159: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f18c08d8e30> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c08d8ce0> <<< 42613 1727204574.96201: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f18c08d90d0> <<< 42613 1727204574.96245: stdout chunk (state=3): >>>import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c084af00> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 42613 1727204574.96277: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 42613 1727204574.96327: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 42613 1727204574.96345: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c08d97c0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c08d9490> <<< 42613 1727204574.96403: stdout chunk (state=3): >>>import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py <<< 42613 1727204574.96425: stdout chunk (state=3): >>># 
code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c08da6c0> <<< 42613 1727204574.96450: stdout chunk (state=3): >>>import 'importlib.util' # import 'runpy' # <<< 42613 1727204574.96479: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 42613 1727204574.96629: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c08f48c0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f18c08f5fd0> <<< 42613 1727204574.96633: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 42613 1727204574.96678: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py <<< 42613 1727204574.96681: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 42613 1727204574.96696: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c08f6e40> <<< 42613 1727204574.96756: stdout chunk (state=3): >>># extension module '_bz2' loaded from 
'/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f18c08f7470> <<< 42613 1727204574.96763: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c08f6390> <<< 42613 1727204574.96785: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 42613 1727204574.96842: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 42613 1727204574.96870: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f18c08f7e60> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c08f7590> <<< 42613 1727204574.96943: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c08da6f0> <<< 42613 1727204574.96957: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 42613 1727204574.97004: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 42613 1727204574.97052: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 42613 1727204574.97089: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # 
extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f18c06cbce0> <<< 42613 1727204574.97156: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f18c06f8860> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c06f85c0> <<< 42613 1727204574.97188: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f18c06f8800> <<< 42613 1727204574.97244: stdout chunk (state=3): >>># extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f18c06f89e0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c06c9e80> <<< 42613 1727204574.97270: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 42613 1727204574.97445: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 42613 
1727204574.97490: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 42613 1727204574.97504: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c06fa090> <<< 42613 1727204574.97554: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c06f8d10> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c08dade0> <<< 42613 1727204574.97585: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 42613 1727204574.97670: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 42613 1727204574.97749: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 42613 1727204574.97790: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c07223c0> <<< 42613 1727204574.97855: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 42613 1727204574.97890: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 42613 1727204574.97900: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 42613 1727204574.97938: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 42613 1727204574.98006: stdout 
chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c073a510> <<< 42613 1727204574.98036: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 42613 1727204574.98090: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 42613 1727204574.98224: stdout chunk (state=3): >>>import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c07772c0> <<< 42613 1727204574.98246: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 42613 1727204574.98337: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 42613 1727204574.98392: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 42613 1727204574.98546: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c0799a60> <<< 42613 1727204574.98664: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c07773e0> <<< 42613 1727204574.98723: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c073b170> <<< 42613 1727204574.98782: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from 
'/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c0578440> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c0739550> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c06faf90> <<< 42613 1727204574.98961: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f18c05786e0> <<< 42613 1727204574.99085: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_od32s5cf/ansible_stat_payload.zip' # zipimport: zlib available <<< 42613 1727204574.99351: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204574.99510: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 42613 1727204574.99584: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 42613 1727204574.99660: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c05d2240> <<< 42613 1727204574.99672: stdout chunk (state=3): >>>import '_typing' # <<< 42613 1727204574.99983: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c05a9130> <<< 42613 1727204575.00004: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f18c05a8290> # zipimport: zlib available <<< 42613 1727204575.00039: stdout chunk (state=3): >>>import 'ansible' # <<< 42613 1727204575.00067: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204575.00071: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204575.00101: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils' # <<< 42613 1727204575.00125: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204575.02782: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204575.05101: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c05ab230> <<< 42613 1727204575.05139: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 42613 1727204575.05164: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 42613 1727204575.05242: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f18c05fdca0> <<< 42613 
1727204575.05293: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c05fda30> <<< 42613 1727204575.05346: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c05fd340> <<< 42613 1727204575.05376: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 42613 1727204575.05448: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c05fd790> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c05d2c60> import 'atexit' # <<< 42613 1727204575.05517: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f18c05fe960> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' <<< 42613 1727204575.05535: stdout chunk (state=3): >>># extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f18c05feba0> <<< 42613 1727204575.05558: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 42613 1727204575.05631: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 42613 1727204575.05721: stdout chunk (state=3): >>>import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c05ff080> <<< 42613 
1727204575.05739: stdout chunk (state=3): >>>import 'pwd' # <<< 42613 1727204575.05756: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 42613 1727204575.05798: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 42613 1727204575.05850: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c045ce90> <<< 42613 1727204575.05896: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f18c045eab0> <<< 42613 1727204575.05920: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 42613 1727204575.05941: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 42613 1727204575.06024: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c045f440> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 42613 1727204575.06059: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 42613 1727204575.06085: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c0460620> <<< 42613 1727204575.06163: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 42613 1727204575.06189: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 42613 1727204575.06286: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c0463110> <<< 42613 1727204575.06347: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f18c0463230> <<< 42613 1727204575.06368: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c04613d0> <<< 42613 1727204575.06387: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 42613 1727204575.06463: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 42613 1727204575.06487: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 42613 1727204575.06577: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 42613 1727204575.06596: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c0467050> <<< 42613 
1727204575.06605: stdout chunk (state=3): >>>import '_tokenize' # <<< 42613 1727204575.06733: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c0465b20> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c0465880> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 42613 1727204575.06745: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 42613 1727204575.06872: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c0465df0> <<< 42613 1727204575.06945: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c04618e0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f18c04af200> <<< 42613 1727204575.06988: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c04af380> <<< 42613 1727204575.07019: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 42613 1727204575.07073: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches 
/usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 42613 1727204575.07134: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f18c04b0f80> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c04b0d40> <<< 42613 1727204575.07309: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 42613 1727204575.07359: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 42613 1727204575.07445: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 42613 1727204575.07471: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f18c04b3410> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c04b1640> <<< 42613 1727204575.07510: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 42613 1727204575.07588: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 42613 1727204575.07623: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 42613 1727204575.07644: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' 
<<< 42613 1727204575.07665: stdout chunk (state=3): >>>import '_string' # <<< 42613 1727204575.07745: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c04bac30> <<< 42613 1727204575.07981: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c04b35c0> <<< 42613 1727204575.08106: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 42613 1727204575.08131: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 42613 1727204575.08143: stdout chunk (state=3): >>>import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f18c04bba40> <<< 42613 1727204575.08174: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' <<< 42613 1727204575.08318: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f18c04bba70> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f18c04bbda0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c04af680> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code 
object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 42613 1727204575.08330: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 42613 1727204575.08370: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 42613 1727204575.08421: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 42613 1727204575.08469: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f18c04bf740><<< 42613 1727204575.08706: stdout chunk (state=3): >>> <<< 42613 1727204575.08790: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 42613 1727204575.08822: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f18c04c06b0><<< 42613 1727204575.08854: stdout chunk (state=3): >>> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c04bdeb0><<< 42613 1727204575.08864: stdout chunk (state=3): >>> <<< 42613 1727204575.08921: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f18c04bf230> import 'systemd.daemon' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f18c04bdac0><<< 42613 1727204575.08938: stdout chunk (state=3): >>> # zipimport: zlib available<<< 42613 1727204575.08971: stdout chunk (state=3): >>> # zipimport: zlib available<<< 42613 1727204575.08993: stdout chunk (state=3): >>> import 'ansible.module_utils.compat' # <<< 42613 1727204575.09024: stdout chunk (state=3): >>> # zipimport: zlib available<<< 42613 1727204575.09041: stdout chunk (state=3): >>> <<< 42613 1727204575.09184: stdout chunk (state=3): >>># zipimport: zlib available<<< 42613 1727204575.09305: stdout chunk (state=3): >>> <<< 42613 1727204575.09361: stdout chunk (state=3): >>># zipimport: zlib available<<< 42613 1727204575.09382: stdout chunk (state=3): >>> <<< 42613 1727204575.09402: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # <<< 42613 1727204575.09426: stdout chunk (state=3): >>># zipimport: zlib available<<< 42613 1727204575.09470: stdout chunk (state=3): >>> <<< 42613 1727204575.09474: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204575.09499: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text' # <<< 42613 1727204575.09526: stdout chunk (state=3): >>># zipimport: zlib available<<< 42613 1727204575.09538: stdout chunk (state=3): >>> <<< 42613 1727204575.09750: stdout chunk (state=3): >>># zipimport: zlib available<<< 42613 1727204575.09802: stdout chunk (state=3): >>> <<< 42613 1727204575.09990: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204575.11101: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204575.12191: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 42613 1727204575.12438: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc 
matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f18c0548860> <<< 42613 1727204575.12495: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py <<< 42613 1727204575.12498: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 42613 1727204575.12556: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c05496a0> <<< 42613 1727204575.12562: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c04c30e0> <<< 42613 1727204575.12638: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 42613 1727204575.12664: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204575.12693: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204575.12720: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # <<< 42613 1727204575.12747: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204575.13049: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204575.13342: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 42613 1727204575.13362: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 42613 1727204575.13372: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f18c05496d0> <<< 42613 1727204575.13382: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204575.14310: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204575.15206: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204575.15323: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204575.15491: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 42613 1727204575.15494: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204575.15689: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available <<< 42613 1727204575.15837: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 42613 1727204575.15979: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available <<< 42613 1727204575.15995: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 42613 1727204575.16024: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204575.16659: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204575.16957: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 42613 1727204575.17312: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c054a4b0> # zipimport: zlib available <<< 42613 1727204575.17364: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204575.17493: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 
'ansible.module_utils.common.parameters' # <<< 42613 1727204575.17527: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 42613 1727204575.17555: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 42613 1727204575.17678: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 42613 1727204575.17876: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f18c0552300> <<< 42613 1727204575.17963: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f18c0552c30> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c05a8f50> <<< 42613 1727204575.17992: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204575.18062: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204575.18173: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 42613 1727204575.18177: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204575.18210: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204575.18292: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204575.18379: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204575.18492: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 42613 1727204575.18579: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 42613 1727204575.18700: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' <<< 42613 1727204575.18715: stdout chunk (state=3): >>>import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f18c05519d0> <<< 42613 1727204575.18764: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c0552d80> <<< 42613 1727204575.18839: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # <<< 42613 1727204575.18862: stdout chunk (state=3): >>>import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 42613 1727204575.19121: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 42613 1727204575.19217: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 42613 1727204575.19445: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code 
object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 42613 1727204575.19464: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c03e2f30> <<< 42613 1727204575.19784: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c035cc50> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c035ae40> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c035aba0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 42613 1727204575.19838: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 42613 1727204575.19846: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204575.19870: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # <<< 42613 1727204575.19960: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204575.20350: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204575.20592: stdout chunk (state=3): >>># zipimport: zlib available <<< 42613 1727204575.20641: stdout chunk (state=3): >>> <<< 42613 1727204575.20703: stdout chunk (state=3): >>>{"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}}<<< 42613 1727204575.20720: stdout chunk (state=3): >>> # destroy __main__ <<< 42613 1727204575.21374: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear 
sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ <<< 42613 1727204575.21407: stdout chunk (state=3): >>># clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr <<< 42613 1727204575.21474: stdout chunk (state=3): >>># cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing 
re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external <<< 42613 1727204575.21575: stdout chunk (state=3): >>># cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil <<< 42613 1727204575.21601: stdout chunk (state=3): >>># cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing 
__future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess<<< 42613 1727204575.21645: stdout chunk (state=3): >>> # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize <<< 42613 1727204575.21907: stdout chunk (state=3): >>># cleanup[2] removing tokenize # cleanup[2] removing linecache<<< 42613 1727204575.21939: stdout chunk (state=3): >>> # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # 
cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing 
ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 42613 1727204575.22172: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 42613 1727204575.22197: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma <<< 42613 1727204575.22277: stdout chunk (state=3): >>># destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress <<< 42613 1727204575.22335: stdout chunk (state=3): >>># destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib <<< 42613 1727204575.22428: stdout chunk (state=3): >>># destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime <<< 42613 1727204575.22555: stdout chunk (state=3): >>># destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # 
destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon<<< 42613 1727204575.22611: stdout chunk (state=3): >>> # cleanup[3] wiping _socket <<< 42613 1727204575.22648: stdout chunk (state=3): >>># cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix <<< 42613 1727204575.22776: stdout chunk (state=3): >>># destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping 
stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix <<< 42613 1727204575.22802: stdout chunk (state=3): >>># cleanup[3] wiping marshal # cleanup[3] wiping _io <<< 42613 1727204575.22908: stdout chunk (state=3): >>># cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 42613 1727204575.23015: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 42613 1727204575.23033: stdout chunk (state=3): >>># destroy _collections <<< 42613 1727204575.23189: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser <<< 42613 1727204575.23217: stdout chunk (state=3): >>># destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 42613 1727204575.23317: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 42613 1727204575.23338: stdout chunk (state=3): 
>>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs <<< 42613 1727204575.23375: stdout chunk (state=3): >>># destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 42613 1727204575.23395: stdout chunk (state=3): >>># destroy _random <<< 42613 1727204575.23590: stdout chunk (state=3): >>># destroy _weakref # destroy _operator # destroy _sha2 # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 42613 1727204575.24038: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204575.24139: stderr chunk (state=3): >>>Shared connection to 10.31.45.169 closed. <<< 42613 1727204575.24262: stderr chunk (state=3): >>><<< 42613 1727204575.24268: stdout chunk (state=3): >>><<< 42613 1727204575.24469: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c09fc530> 
import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c09cbb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c09feab0> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c08111c0> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c08120c0> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c084ffb0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c0864140> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c0887950> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f18c0887fe0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c0867c20> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c08653a0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c084d160> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c08ab8f0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c08aa510> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c0866240> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c08a8d70> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c08d8980> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c084c3e0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches 
/usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f18c08d8e30> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c08d8ce0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f18c08d90d0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c084af00> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c08d97c0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c08d9490> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c08da6c0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches 
/usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c08f48c0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f18c08f5fd0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c08f6e40> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f18c08f7470> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c08f6390> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f18c08f7e60> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c08f7590> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c08da6f0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f18c06cbce0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f18c06f8860> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c06f85c0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f18c06f8800> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed 
from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f18c06f89e0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c06c9e80> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c06fa090> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c06f8d10> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c08dade0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c07223c0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c073a510> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py 
# code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c07772c0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c0799a60> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c07773e0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c073b170> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c0578440> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c0739550> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c06faf90> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f18c05786e0> # zipimport: found 30 names in '/tmp/ansible_stat_payload_od32s5cf/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from 
'/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c05d2240> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c05a9130> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c05a8290> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c05ab230> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from 
'/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f18c05fdca0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c05fda30> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c05fd340> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c05fd790> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c05d2c60> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f18c05fe960> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f18c05feba0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c05ff080> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c045ce90> # extension module 'select' loaded from 
'/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f18c045eab0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c045f440> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c0460620> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c0463110> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f18c0463230> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c04613d0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc 
matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c0467050> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c0465b20> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c0465880> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c0465df0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c04618e0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f18c04af200> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c04af380> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from 
'/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f18c04b0f80> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c04b0d40> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f18c04b3410> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c04b1640> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c04bac30> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c04b35c0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # 
extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f18c04bba40> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f18c04bba70> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f18c04bbda0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c04af680> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f18c04bf740> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from 
'/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f18c04c06b0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c04bdeb0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f18c04bf230> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c04bdac0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f18c0548860> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches 
/usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c05496a0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c04c30e0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c05496d0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c054a4b0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches 
/usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f18c0552300> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f18c0552c30> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c05a8f50> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f18c05519d0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c0552d80> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available 
# /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c03e2f30> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c035cc50> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c035ae40> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f18c035aba0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # 
clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # 
cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # 
cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy 
copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] 
removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping 
collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy 
copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. [WARNING]: Module invocation had junk after the JSON data: [warning payload is byte-for-byte identical to the interpreter-shutdown output shown above, from "# destroy __main__" through "clear sys.audit hooks"; duplicate elided] 42613 1727204575.26326: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2,
'_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204574.5594425-42805-217186559099536/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 42613 1727204575.26331: _low_level_execute_command(): starting 42613 1727204575.26338: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204574.5594425-42805-217186559099536/ > /dev/null 2>&1 && sleep 0' 42613 1727204575.26677: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204575.26890: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204575.27010: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 42613 1727204575.27088: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 42613 1727204575.29989: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204575.30125: stderr chunk (state=3): >>><<< 42613 1727204575.30147: stdout chunk (state=3): >>><<< 42613 1727204575.30356: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 42613 1727204575.30362: handler run complete 42613 1727204575.30365: attempt loop complete, returning result 42613 1727204575.30367: _execute() done 42613 1727204575.30371: dumping result to json 42613 1727204575.30373: done dumping result, returning 42613 1727204575.30376: done running TaskExecutor() for managed-node3/TASK: Check if system is ostree 
[127b8e07-fff9-2f91-05d8-0000000000c2] 42613 1727204575.30378: sending task result for task 127b8e07-fff9-2f91-05d8-0000000000c2 ok: [managed-node3] => { "changed": false, "stat": { "exists": false } } 42613 1727204575.30625: no more pending results, returning what we have 42613 1727204575.30629: results queue empty 42613 1727204575.30630: checking for any_errors_fatal 42613 1727204575.30639: done checking for any_errors_fatal 42613 1727204575.30640: checking for max_fail_percentage 42613 1727204575.30642: done checking for max_fail_percentage 42613 1727204575.30642: checking to see if all hosts have failed and the running result is not ok 42613 1727204575.30643: done checking to see if all hosts have failed 42613 1727204575.30644: getting the remaining hosts for this loop 42613 1727204575.30645: done getting the remaining hosts for this loop 42613 1727204575.30649: getting the next task for host managed-node3 42613 1727204575.30656: done getting next task for host managed-node3 42613 1727204575.30658: ^ task is: TASK: Set flag to indicate system is ostree 42613 1727204575.30661: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204575.30665: getting variables 42613 1727204575.30668: in VariableManager get_vars() 42613 1727204575.30700: Calling all_inventory to load vars for managed-node3 42613 1727204575.30703: Calling groups_inventory to load vars for managed-node3 42613 1727204575.30707: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204575.30720: Calling all_plugins_play to load vars for managed-node3 42613 1727204575.30723: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204575.30727: Calling groups_plugins_play to load vars for managed-node3 42613 1727204575.31274: done sending task result for task 127b8e07-fff9-2f91-05d8-0000000000c2 42613 1727204575.31284: WORKER PROCESS EXITING 42613 1727204575.31300: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204575.31770: done with get_vars() 42613 1727204575.31783: done getting variables 42613 1727204575.32089: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Tuesday 24 September 2024 15:02:55 -0400 (0:00:00.848) 0:00:03.929 ***** 42613 1727204575.32121: entering _queue_task() for managed-node3/set_fact 42613 1727204575.32123: Creating lock for set_fact 42613 1727204575.32905: worker is 1 (out of 1 available) 42613 1727204575.32916: exiting _queue_task() for managed-node3/set_fact 42613 1727204575.32928: done queuing things up, now waiting for results queue to drain 42613 1727204575.32930: waiting for pending results... 
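The pair of tasks traced here — the "Check if system is ostree" `stat` call whose result (`"exists": false`) appears above, and the "Set flag to indicate system is ostree" `set_fact` that is queued next — can be sketched as below. This is a hedged reconstruction from the logged task names, the conditional `not __network_is_ostree is defined`, and the registered variable `__ostree_booted_stat`, not the actual contents of `el_repo_setup.yml`; the `path` argument is an assumption.

```yaml
# Hypothetical reconstruction of the ostree-detection tasks in this trace.
# Task names, the when: condition, and variable names come from the log;
# the stat path (/run/ostree-booted is the conventional marker file) is assumed.
- name: Check if system is ostree
  stat:
    path: /run/ostree-booted  # assumed; the log only shows "exists": false
  register: __ostree_booted_stat
  when: not __network_is_ostree is defined

- name: Set flag to indicate system is ostree
  set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined
```

With `stat.exists` false on this node, the fact resolves to `__network_is_ostree: false`, matching the `ok:` result recorded further down in the trace.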
42613 1727204575.33567: running TaskExecutor() for managed-node3/TASK: Set flag to indicate system is ostree 42613 1727204575.34154: in run() - task 127b8e07-fff9-2f91-05d8-0000000000c3 42613 1727204575.34159: variable 'ansible_search_path' from source: unknown 42613 1727204575.34162: variable 'ansible_search_path' from source: unknown 42613 1727204575.34167: calling self._execute() 42613 1727204575.34393: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204575.34408: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204575.34529: variable 'omit' from source: magic vars 42613 1727204575.35627: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 42613 1727204575.36276: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 42613 1727204575.36312: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 42613 1727204575.36419: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 42613 1727204575.36531: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 42613 1727204575.36772: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 42613 1727204575.36776: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 42613 1727204575.37039: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204575.37042: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 42613 1727204575.37464: Evaluated conditional (not __network_is_ostree is defined): True 42613 1727204575.37482: variable 'omit' from source: magic vars 42613 1727204575.37643: variable 'omit' from source: magic vars 42613 1727204575.37854: variable '__ostree_booted_stat' from source: set_fact 42613 1727204575.37980: variable 'omit' from source: magic vars 42613 1727204575.38068: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 42613 1727204575.38102: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 42613 1727204575.38162: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 42613 1727204575.38360: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204575.38363: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204575.38367: variable 'inventory_hostname' from source: host vars for 'managed-node3' 42613 1727204575.38369: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204575.38372: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204575.38548: Set connection var ansible_shell_executable to /bin/sh 42613 1727204575.38792: Set connection var ansible_pipelining to False 42613 1727204575.38795: Set connection var ansible_module_compression to ZIP_DEFLATED 42613 1727204575.38798: Set connection var ansible_connection to ssh 42613 1727204575.38800: Set connection var ansible_timeout to 10 42613 1727204575.38803: Set connection var ansible_shell_type to sh 42613 1727204575.38805: variable 'ansible_shell_executable' 
from source: unknown 42613 1727204575.38807: variable 'ansible_connection' from source: unknown 42613 1727204575.38809: variable 'ansible_module_compression' from source: unknown 42613 1727204575.38811: variable 'ansible_shell_type' from source: unknown 42613 1727204575.38813: variable 'ansible_shell_executable' from source: unknown 42613 1727204575.38816: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204575.38818: variable 'ansible_pipelining' from source: unknown 42613 1727204575.38820: variable 'ansible_timeout' from source: unknown 42613 1727204575.38822: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204575.39050: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 42613 1727204575.39129: variable 'omit' from source: magic vars 42613 1727204575.39138: starting attempt loop 42613 1727204575.39144: running the handler 42613 1727204575.39333: handler run complete 42613 1727204575.39336: attempt loop complete, returning result 42613 1727204575.39338: _execute() done 42613 1727204575.39340: dumping result to json 42613 1727204575.39342: done dumping result, returning 42613 1727204575.39344: done running TaskExecutor() for managed-node3/TASK: Set flag to indicate system is ostree [127b8e07-fff9-2f91-05d8-0000000000c3] 42613 1727204575.39346: sending task result for task 127b8e07-fff9-2f91-05d8-0000000000c3 42613 1727204575.39423: done sending task result for task 127b8e07-fff9-2f91-05d8-0000000000c3 42613 1727204575.39427: WORKER PROCESS EXITING ok: [managed-node3] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 42613 1727204575.39495: no more pending results, returning what we have 42613 1727204575.39499: results 
queue empty 42613 1727204575.39500: checking for any_errors_fatal 42613 1727204575.39511: done checking for any_errors_fatal 42613 1727204575.39512: checking for max_fail_percentage 42613 1727204575.39514: done checking for max_fail_percentage 42613 1727204575.39515: checking to see if all hosts have failed and the running result is not ok 42613 1727204575.39516: done checking to see if all hosts have failed 42613 1727204575.39516: getting the remaining hosts for this loop 42613 1727204575.39518: done getting the remaining hosts for this loop 42613 1727204575.39522: getting the next task for host managed-node3 42613 1727204575.39535: done getting next task for host managed-node3 42613 1727204575.39538: ^ task is: TASK: Fix CentOS6 Base repo 42613 1727204575.39542: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204575.39546: getting variables 42613 1727204575.39548: in VariableManager get_vars() 42613 1727204575.39582: Calling all_inventory to load vars for managed-node3 42613 1727204575.39585: Calling groups_inventory to load vars for managed-node3 42613 1727204575.39588: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204575.39600: Calling all_plugins_play to load vars for managed-node3 42613 1727204575.39602: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204575.39612: Calling groups_plugins_play to load vars for managed-node3 42613 1727204575.40223: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204575.41288: done with get_vars() 42613 1727204575.41303: done getting variables 42613 1727204575.41445: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Tuesday 24 September 2024 15:02:55 -0400 (0:00:00.096) 0:00:04.025 ***** 42613 1727204575.41770: entering _queue_task() for managed-node3/copy 42613 1727204575.42427: worker is 1 (out of 1 available) 42613 1727204575.42444: exiting _queue_task() for managed-node3/copy 42613 1727204575.42458: done queuing things up, now waiting for results queue to drain 42613 1727204575.42459: waiting for pending results... 
42613 1727204575.43076: running TaskExecutor() for managed-node3/TASK: Fix CentOS6 Base repo 42613 1727204575.43180: in run() - task 127b8e07-fff9-2f91-05d8-0000000000c5 42613 1727204575.43266: variable 'ansible_search_path' from source: unknown 42613 1727204575.43274: variable 'ansible_search_path' from source: unknown 42613 1727204575.43316: calling self._execute() 42613 1727204575.43515: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204575.43586: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204575.43602: variable 'omit' from source: magic vars 42613 1727204575.44913: variable 'ansible_distribution' from source: facts 42613 1727204575.45047: Evaluated conditional (ansible_distribution == 'CentOS'): False 42613 1727204575.45053: when evaluation is False, skipping this task 42613 1727204575.45057: _execute() done 42613 1727204575.45060: dumping result to json 42613 1727204575.45063: done dumping result, returning 42613 1727204575.45069: done running TaskExecutor() for managed-node3/TASK: Fix CentOS6 Base repo [127b8e07-fff9-2f91-05d8-0000000000c5] 42613 1727204575.45211: sending task result for task 127b8e07-fff9-2f91-05d8-0000000000c5 42613 1727204575.45301: done sending task result for task 127b8e07-fff9-2f91-05d8-0000000000c5 42613 1727204575.45305: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution == 'CentOS'", "skip_reason": "Conditional result was False" } 42613 1727204575.45381: no more pending results, returning what we have 42613 1727204575.45385: results queue empty 42613 1727204575.45386: checking for any_errors_fatal 42613 1727204575.45391: done checking for any_errors_fatal 42613 1727204575.45392: checking for max_fail_percentage 42613 1727204575.45394: done checking for max_fail_percentage 42613 1727204575.45395: checking to see if all hosts have failed and the running result is not ok 42613 1727204575.45396: done 
checking to see if all hosts have failed 42613 1727204575.45396: getting the remaining hosts for this loop 42613 1727204575.45398: done getting the remaining hosts for this loop 42613 1727204575.45403: getting the next task for host managed-node3 42613 1727204575.45412: done getting next task for host managed-node3 42613 1727204575.45415: ^ task is: TASK: Include the task 'enable_epel.yml' 42613 1727204575.45419: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 42613 1727204575.45423: getting variables 42613 1727204575.45425: in VariableManager get_vars() 42613 1727204575.45459: Calling all_inventory to load vars for managed-node3 42613 1727204575.45462: Calling groups_inventory to load vars for managed-node3 42613 1727204575.45669: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204575.45683: Calling all_plugins_play to load vars for managed-node3 42613 1727204575.45686: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204575.45689: Calling groups_plugins_play to load vars for managed-node3 42613 1727204575.46140: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204575.46583: done with get_vars() 42613 1727204575.46597: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: 
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Tuesday 24 September 2024 15:02:55 -0400 (0:00:00.051) 0:00:04.077 ***** 42613 1727204575.46915: entering _queue_task() for managed-node3/include_tasks 42613 1727204575.47478: worker is 1 (out of 1 available) 42613 1727204575.47493: exiting _queue_task() for managed-node3/include_tasks 42613 1727204575.47508: done queuing things up, now waiting for results queue to drain 42613 1727204575.47509: waiting for pending results... 42613 1727204575.48092: running TaskExecutor() for managed-node3/TASK: Include the task 'enable_epel.yml' 42613 1727204575.48223: in run() - task 127b8e07-fff9-2f91-05d8-0000000000c6 42613 1727204575.48276: variable 'ansible_search_path' from source: unknown 42613 1727204575.48620: variable 'ansible_search_path' from source: unknown 42613 1727204575.48625: calling self._execute() 42613 1727204575.49093: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204575.49098: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204575.49101: variable 'omit' from source: magic vars 42613 1727204575.50347: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 42613 1727204575.56151: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 42613 1727204575.56334: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 42613 1727204575.56383: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 42613 1727204575.56508: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 42613 1727204575.56564: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 42613 1727204575.56732: 
Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204575.56970: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204575.56975: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204575.57172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204575.57177: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204575.57326: variable '__network_is_ostree' from source: set_fact 42613 1727204575.57423: Evaluated conditional (not __network_is_ostree | d(false)): True 42613 1727204575.57435: _execute() done 42613 1727204575.57443: dumping result to json 42613 1727204575.57450: done dumping result, returning 42613 1727204575.57461: done running TaskExecutor() for managed-node3/TASK: Include the task 'enable_epel.yml' [127b8e07-fff9-2f91-05d8-0000000000c6] 42613 1727204575.57473: sending task result for task 127b8e07-fff9-2f91-05d8-0000000000c6 42613 1727204575.57836: no more pending results, returning what we have 42613 1727204575.57841: in VariableManager get_vars() 42613 1727204575.57879: Calling all_inventory to load vars for managed-node3 42613 1727204575.57882: Calling groups_inventory to load vars for managed-node3 42613 1727204575.57886: Calling all_plugins_inventory to load vars 
for managed-node3 42613 1727204575.57898: Calling all_plugins_play to load vars for managed-node3 42613 1727204575.57902: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204575.57905: Calling groups_plugins_play to load vars for managed-node3 42613 1727204575.58290: done sending task result for task 127b8e07-fff9-2f91-05d8-0000000000c6 42613 1727204575.58297: WORKER PROCESS EXITING 42613 1727204575.58325: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204575.58768: done with get_vars() 42613 1727204575.58778: variable 'ansible_search_path' from source: unknown 42613 1727204575.58779: variable 'ansible_search_path' from source: unknown 42613 1727204575.58824: we have included files to process 42613 1727204575.58825: generating all_blocks data 42613 1727204575.58828: done generating all_blocks data 42613 1727204575.58837: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 42613 1727204575.58839: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 42613 1727204575.58842: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 42613 1727204575.60663: done processing included file 42613 1727204575.60669: iterating over new_blocks loaded from include file 42613 1727204575.60671: in VariableManager get_vars() 42613 1727204575.60686: done with get_vars() 42613 1727204575.60688: filtering new block on tags 42613 1727204575.60715: done filtering new block on tags 42613 1727204575.60719: in VariableManager get_vars() 42613 1727204575.60735: done with get_vars() 42613 1727204575.60737: filtering new block on tags 42613 1727204575.60751: done filtering new block on tags 42613 1727204575.60754: done iterating over new_blocks loaded from include file included: 
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed-node3 42613 1727204575.60761: extending task lists for all hosts with included blocks 42613 1727204575.61085: done extending task lists 42613 1727204575.61087: done processing included files 42613 1727204575.61088: results queue empty 42613 1727204575.61089: checking for any_errors_fatal 42613 1727204575.61092: done checking for any_errors_fatal 42613 1727204575.61093: checking for max_fail_percentage 42613 1727204575.61094: done checking for max_fail_percentage 42613 1727204575.61095: checking to see if all hosts have failed and the running result is not ok 42613 1727204575.61096: done checking to see if all hosts have failed 42613 1727204575.61097: getting the remaining hosts for this loop 42613 1727204575.61098: done getting the remaining hosts for this loop 42613 1727204575.61101: getting the next task for host managed-node3 42613 1727204575.61106: done getting next task for host managed-node3 42613 1727204575.61108: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 42613 1727204575.61111: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204575.61114: getting variables 42613 1727204575.61115: in VariableManager get_vars() 42613 1727204575.61124: Calling all_inventory to load vars for managed-node3 42613 1727204575.61127: Calling groups_inventory to load vars for managed-node3 42613 1727204575.61130: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204575.61139: Calling all_plugins_play to load vars for managed-node3 42613 1727204575.61149: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204575.61153: Calling groups_plugins_play to load vars for managed-node3 42613 1727204575.61563: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204575.62004: done with get_vars() 42613 1727204575.62019: done getting variables 42613 1727204575.62316: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 42613 1727204575.62756: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 40] ********************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Tuesday 24 September 2024 15:02:55 -0400 (0:00:00.158) 0:00:04.236 ***** 42613 1727204575.62811: entering _queue_task() for managed-node3/command 42613 1727204575.62813: Creating lock for command 42613 1727204575.63584: worker is 1 (out of 1 available) 42613 1727204575.63601: exiting _queue_task() for managed-node3/command 42613 1727204575.63613: done queuing things up, now waiting for results queue to drain 42613 1727204575.63616: waiting for pending results... 
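The task header above shows a templated task name: `TASK [Create EPEL 40]` is rendered from `Create EPEL {{ ansible_distribution_major_version }}` using the `ansible_distribution_major_version` fact. A hedged sketch of the shape of such a task follows; only the name template and the `when:` condition are taken from the log — the actual `command` arguments are never recorded here because the task is skipped before execution.

```yaml
# Hypothetical sketch; only the templated name and the when: condition are
# grounded in the log. The command body is not shown in the trace (the task
# is skipped on this non-RedHat/CentOS node), so it is left elided.
- name: Create EPEL {{ ansible_distribution_major_version }}
  command: ...  # not recorded in the log; task skips before running
  when: ansible_distribution in ['RedHat', 'CentOS']
```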
42613 1727204575.64164: running TaskExecutor() for managed-node3/TASK: Create EPEL 40 42613 1727204575.64478: in run() - task 127b8e07-fff9-2f91-05d8-0000000000e0 42613 1727204575.64497: variable 'ansible_search_path' from source: unknown 42613 1727204575.64502: variable 'ansible_search_path' from source: unknown 42613 1727204575.64572: calling self._execute() 42613 1727204575.64744: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204575.64748: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204575.64760: variable 'omit' from source: magic vars 42613 1727204575.65678: variable 'ansible_distribution' from source: facts 42613 1727204575.65701: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 42613 1727204575.65704: when evaluation is False, skipping this task 42613 1727204575.65707: _execute() done 42613 1727204575.65710: dumping result to json 42613 1727204575.65713: done dumping result, returning 42613 1727204575.65716: done running TaskExecutor() for managed-node3/TASK: Create EPEL 40 [127b8e07-fff9-2f91-05d8-0000000000e0] 42613 1727204575.65718: sending task result for task 127b8e07-fff9-2f91-05d8-0000000000e0 42613 1727204575.65905: done sending task result for task 127b8e07-fff9-2f91-05d8-0000000000e0 42613 1727204575.65910: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 42613 1727204575.65981: no more pending results, returning what we have 42613 1727204575.65985: results queue empty 42613 1727204575.65986: checking for any_errors_fatal 42613 1727204575.65988: done checking for any_errors_fatal 42613 1727204575.65988: checking for max_fail_percentage 42613 1727204575.65990: done checking for max_fail_percentage 42613 1727204575.65991: checking to see if all hosts have failed and the running result is not ok 42613 
1727204575.65992: done checking to see if all hosts have failed 42613 1727204575.65993: getting the remaining hosts for this loop 42613 1727204575.65994: done getting the remaining hosts for this loop 42613 1727204575.65999: getting the next task for host managed-node3 42613 1727204575.66009: done getting next task for host managed-node3 42613 1727204575.66012: ^ task is: TASK: Install yum-utils package 42613 1727204575.66018: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204575.66022: getting variables 42613 1727204575.66024: in VariableManager get_vars() 42613 1727204575.66063: Calling all_inventory to load vars for managed-node3 42613 1727204575.66068: Calling groups_inventory to load vars for managed-node3 42613 1727204575.66072: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204575.66088: Calling all_plugins_play to load vars for managed-node3 42613 1727204575.66091: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204575.66094: Calling groups_plugins_play to load vars for managed-node3 42613 1727204575.66715: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204575.67882: done with get_vars() 42613 1727204575.67896: done getting variables 42613 1727204575.68011: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Tuesday 24 September 2024 15:02:55 -0400 (0:00:00.052) 0:00:04.288 ***** 42613 1727204575.68046: entering _queue_task() for managed-node3/package 42613 1727204575.68048: Creating lock for package 42613 1727204575.68813: worker is 1 (out of 1 available) 42613 1727204575.68827: exiting _queue_task() for managed-node3/package 42613 1727204575.68841: done queuing things up, now waiting for results queue to drain 42613 1727204575.68842: waiting for pending results... 
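The "Install yum-utils package" task queued above uses the `package` action (the trace loads `ansible/plugins/action/package.py`) and, like the other EPEL setup tasks, skips on this node because `ansible_distribution in ['RedHat', 'CentOS']` evaluates false. A minimal sketch, with the caveat that the module arguments are assumptions inferred from the task name — the log records only the action plugin and the skip:

```yaml
# Hypothetical sketch; the package action and when: condition come from the
# log, while the name/state arguments are assumed from the task title.
- name: Install yum-utils package
  package:
    name: yum-utils   # assumed; args are not logged for a skipped task
    state: present    # assumed
  when: ansible_distribution in ['RedHat', 'CentOS']
```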
42613 1727204575.69286: running TaskExecutor() for managed-node3/TASK: Install yum-utils package 42613 1727204575.69292: in run() - task 127b8e07-fff9-2f91-05d8-0000000000e1 42613 1727204575.69295: variable 'ansible_search_path' from source: unknown 42613 1727204575.69297: variable 'ansible_search_path' from source: unknown 42613 1727204575.69385: calling self._execute() 42613 1727204575.69475: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204575.69479: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204575.69484: variable 'omit' from source: magic vars 42613 1727204575.69960: variable 'ansible_distribution' from source: facts 42613 1727204575.69996: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 42613 1727204575.70004: when evaluation is False, skipping this task 42613 1727204575.70023: _execute() done 42613 1727204575.70125: dumping result to json 42613 1727204575.70129: done dumping result, returning 42613 1727204575.70132: done running TaskExecutor() for managed-node3/TASK: Install yum-utils package [127b8e07-fff9-2f91-05d8-0000000000e1] 42613 1727204575.70135: sending task result for task 127b8e07-fff9-2f91-05d8-0000000000e1 42613 1727204575.70228: done sending task result for task 127b8e07-fff9-2f91-05d8-0000000000e1 42613 1727204575.70232: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 42613 1727204575.70417: no more pending results, returning what we have 42613 1727204575.70421: results queue empty 42613 1727204575.70422: checking for any_errors_fatal 42613 1727204575.70429: done checking for any_errors_fatal 42613 1727204575.70430: checking for max_fail_percentage 42613 1727204575.70432: done checking for max_fail_percentage 42613 1727204575.70435: checking to see if all hosts have failed and the running result is not ok 
42613 1727204575.70435: done checking to see if all hosts have failed 42613 1727204575.70436: getting the remaining hosts for this loop 42613 1727204575.70437: done getting the remaining hosts for this loop 42613 1727204575.70441: getting the next task for host managed-node3 42613 1727204575.70448: done getting next task for host managed-node3 42613 1727204575.70450: ^ task is: TASK: Enable EPEL 7 42613 1727204575.70454: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204575.70457: getting variables 42613 1727204575.70458: in VariableManager get_vars() 42613 1727204575.70489: Calling all_inventory to load vars for managed-node3 42613 1727204575.70492: Calling groups_inventory to load vars for managed-node3 42613 1727204575.70495: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204575.70505: Calling all_plugins_play to load vars for managed-node3 42613 1727204575.70508: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204575.70511: Calling groups_plugins_play to load vars for managed-node3 42613 1727204575.70713: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204575.71163: done with get_vars() 42613 1727204575.71380: done getting variables 42613 1727204575.71450: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Tuesday 24 September 2024 15:02:55 -0400 (0:00:00.034) 0:00:04.323 ***** 42613 1727204575.71489: entering _queue_task() for managed-node3/command 42613 1727204575.72244: worker is 1 (out of 1 available) 42613 1727204575.72260: exiting _queue_task() for managed-node3/command 42613 1727204575.72278: done queuing things up, now waiting for results queue to drain 42613 1727204575.72280: waiting for pending results... 
42613 1727204575.73036: running TaskExecutor() for managed-node3/TASK: Enable EPEL 7 42613 1727204575.73248: in run() - task 127b8e07-fff9-2f91-05d8-0000000000e2 42613 1727204575.73252: variable 'ansible_search_path' from source: unknown 42613 1727204575.73255: variable 'ansible_search_path' from source: unknown 42613 1727204575.73258: calling self._execute() 42613 1727204575.73754: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204575.73757: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204575.73760: variable 'omit' from source: magic vars 42613 1727204575.74240: variable 'ansible_distribution' from source: facts 42613 1727204575.74254: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 42613 1727204575.74258: when evaluation is False, skipping this task 42613 1727204575.74261: _execute() done 42613 1727204575.74264: dumping result to json 42613 1727204575.74268: done dumping result, returning 42613 1727204575.74277: done running TaskExecutor() for managed-node3/TASK: Enable EPEL 7 [127b8e07-fff9-2f91-05d8-0000000000e2] 42613 1727204575.74282: sending task result for task 127b8e07-fff9-2f91-05d8-0000000000e2 42613 1727204575.74388: done sending task result for task 127b8e07-fff9-2f91-05d8-0000000000e2 42613 1727204575.74391: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 42613 1727204575.74453: no more pending results, returning what we have 42613 1727204575.74457: results queue empty 42613 1727204575.74458: checking for any_errors_fatal 42613 1727204575.74464: done checking for any_errors_fatal 42613 1727204575.74467: checking for max_fail_percentage 42613 1727204575.74469: done checking for max_fail_percentage 42613 1727204575.74470: checking to see if all hosts have failed and the running result is not ok 42613 1727204575.74471: 
done checking to see if all hosts have failed 42613 1727204575.74472: getting the remaining hosts for this loop 42613 1727204575.74474: done getting the remaining hosts for this loop 42613 1727204575.74478: getting the next task for host managed-node3 42613 1727204575.74487: done getting next task for host managed-node3 42613 1727204575.74490: ^ task is: TASK: Enable EPEL 8 42613 1727204575.74495: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204575.74499: getting variables 42613 1727204575.74501: in VariableManager get_vars() 42613 1727204575.74539: Calling all_inventory to load vars for managed-node3 42613 1727204575.74543: Calling groups_inventory to load vars for managed-node3 42613 1727204575.74547: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204575.74566: Calling all_plugins_play to load vars for managed-node3 42613 1727204575.74789: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204575.74794: Calling groups_plugins_play to load vars for managed-node3 42613 1727204575.75063: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204575.75514: done with get_vars() 42613 1727204575.75527: done getting variables 42613 1727204575.75803: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Tuesday 24 September 2024 15:02:55 -0400 (0:00:00.043) 0:00:04.366 ***** 42613 1727204575.75838: entering _queue_task() for managed-node3/command 42613 1727204575.76606: worker is 1 (out of 1 available) 42613 1727204575.76620: exiting _queue_task() for managed-node3/command 42613 1727204575.76638: done queuing things up, now waiting for results queue to drain 42613 1727204575.76639: waiting for pending results... 
42613 1727204575.77289: running TaskExecutor() for managed-node3/TASK: Enable EPEL 8 42613 1727204575.77295: in run() - task 127b8e07-fff9-2f91-05d8-0000000000e3 42613 1727204575.77394: variable 'ansible_search_path' from source: unknown 42613 1727204575.77600: variable 'ansible_search_path' from source: unknown 42613 1727204575.77605: calling self._execute() 42613 1727204575.77742: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204575.77758: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204575.77775: variable 'omit' from source: magic vars 42613 1727204575.78647: variable 'ansible_distribution' from source: facts 42613 1727204575.78670: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 42613 1727204575.78687: when evaluation is False, skipping this task 42613 1727204575.78872: _execute() done 42613 1727204575.78876: dumping result to json 42613 1727204575.78880: done dumping result, returning 42613 1727204575.78883: done running TaskExecutor() for managed-node3/TASK: Enable EPEL 8 [127b8e07-fff9-2f91-05d8-0000000000e3] 42613 1727204575.78886: sending task result for task 127b8e07-fff9-2f91-05d8-0000000000e3 42613 1727204575.78972: done sending task result for task 127b8e07-fff9-2f91-05d8-0000000000e3 42613 1727204575.78976: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 42613 1727204575.79030: no more pending results, returning what we have 42613 1727204575.79036: results queue empty 42613 1727204575.79037: checking for any_errors_fatal 42613 1727204575.79044: done checking for any_errors_fatal 42613 1727204575.79044: checking for max_fail_percentage 42613 1727204575.79047: done checking for max_fail_percentage 42613 1727204575.79047: checking to see if all hosts have failed and the running result is not ok 42613 1727204575.79048: 
done checking to see if all hosts have failed 42613 1727204575.79049: getting the remaining hosts for this loop 42613 1727204575.79050: done getting the remaining hosts for this loop 42613 1727204575.79054: getting the next task for host managed-node3 42613 1727204575.79064: done getting next task for host managed-node3 42613 1727204575.79069: ^ task is: TASK: Enable EPEL 6 42613 1727204575.79074: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204575.79077: getting variables 42613 1727204575.79079: in VariableManager get_vars() 42613 1727204575.79112: Calling all_inventory to load vars for managed-node3 42613 1727204575.79115: Calling groups_inventory to load vars for managed-node3 42613 1727204575.79119: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204575.79138: Calling all_plugins_play to load vars for managed-node3 42613 1727204575.79142: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204575.79145: Calling groups_plugins_play to load vars for managed-node3 42613 1727204575.79661: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204575.80335: done with get_vars() 42613 1727204575.80351: done getting variables 42613 1727204575.80420: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Tuesday 24 September 2024 15:02:55 -0400 (0:00:00.046) 0:00:04.412 ***** 42613 1727204575.80459: entering _queue_task() for managed-node3/copy 42613 1727204575.81226: worker is 1 (out of 1 available) 42613 1727204575.81244: exiting _queue_task() for managed-node3/copy 42613 1727204575.81259: done queuing things up, now waiting for results queue to drain 42613 1727204575.81261: waiting for pending results... 
42613 1727204575.81974: running TaskExecutor() for managed-node3/TASK: Enable EPEL 6 42613 1727204575.82177: in run() - task 127b8e07-fff9-2f91-05d8-0000000000e5 42613 1727204575.82207: variable 'ansible_search_path' from source: unknown 42613 1727204575.82216: variable 'ansible_search_path' from source: unknown 42613 1727204575.82263: calling self._execute() 42613 1727204575.82673: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204575.82678: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204575.82681: variable 'omit' from source: magic vars 42613 1727204575.83837: variable 'ansible_distribution' from source: facts 42613 1727204575.83945: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 42613 1727204575.83949: when evaluation is False, skipping this task 42613 1727204575.83951: _execute() done 42613 1727204575.83954: dumping result to json 42613 1727204575.83956: done dumping result, returning 42613 1727204575.83959: done running TaskExecutor() for managed-node3/TASK: Enable EPEL 6 [127b8e07-fff9-2f91-05d8-0000000000e5] 42613 1727204575.83962: sending task result for task 127b8e07-fff9-2f91-05d8-0000000000e5 skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 42613 1727204575.84107: no more pending results, returning what we have 42613 1727204575.84111: results queue empty 42613 1727204575.84111: checking for any_errors_fatal 42613 1727204575.84117: done checking for any_errors_fatal 42613 1727204575.84118: checking for max_fail_percentage 42613 1727204575.84121: done checking for max_fail_percentage 42613 1727204575.84121: checking to see if all hosts have failed and the running result is not ok 42613 1727204575.84122: done checking to see if all hosts have failed 42613 1727204575.84123: getting the remaining hosts for this loop 42613 1727204575.84124: done 
getting the remaining hosts for this loop 42613 1727204575.84129: getting the next task for host managed-node3 42613 1727204575.84141: done getting next task for host managed-node3 42613 1727204575.84145: ^ task is: TASK: Set network provider to 'nm' 42613 1727204575.84147: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 42613 1727204575.84151: getting variables 42613 1727204575.84153: in VariableManager get_vars() 42613 1727204575.84296: Calling all_inventory to load vars for managed-node3 42613 1727204575.84299: Calling groups_inventory to load vars for managed-node3 42613 1727204575.84303: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204575.84311: done sending task result for task 127b8e07-fff9-2f91-05d8-0000000000e5 42613 1727204575.84317: WORKER PROCESS EXITING 42613 1727204575.84336: Calling all_plugins_play to load vars for managed-node3 42613 1727204575.84341: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204575.84345: Calling groups_plugins_play to load vars for managed-node3 42613 1727204575.84947: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204575.85449: done with get_vars() 42613 1727204575.85500: done getting variables 42613 1727204575.85575: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: 
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tests_routing_rules_nm.yml:13 Tuesday 24 September 2024 15:02:55 -0400 (0:00:00.051) 0:00:04.464 ***** 42613 1727204575.85612: entering _queue_task() for managed-node3/set_fact 42613 1727204575.86456: worker is 1 (out of 1 available) 42613 1727204575.86474: exiting _queue_task() for managed-node3/set_fact 42613 1727204575.86490: done queuing things up, now waiting for results queue to drain 42613 1727204575.86492: waiting for pending results... 42613 1727204575.87186: running TaskExecutor() for managed-node3/TASK: Set network provider to 'nm' 42613 1727204575.87375: in run() - task 127b8e07-fff9-2f91-05d8-000000000007 42613 1727204575.87380: variable 'ansible_search_path' from source: unknown 42613 1727204575.87385: calling self._execute() 42613 1727204575.87661: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204575.87678: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204575.87693: variable 'omit' from source: magic vars 42613 1727204575.87948: variable 'omit' from source: magic vars 42613 1727204575.88009: variable 'omit' from source: magic vars 42613 1727204575.88104: variable 'omit' from source: magic vars 42613 1727204575.88173: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 42613 1727204575.88261: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 42613 1727204575.88296: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 42613 1727204575.88325: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204575.88345: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204575.88398: variable 
'inventory_hostname' from source: host vars for 'managed-node3' 42613 1727204575.88409: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204575.88424: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204575.88549: Set connection var ansible_shell_executable to /bin/sh 42613 1727204575.88638: Set connection var ansible_pipelining to False 42613 1727204575.88642: Set connection var ansible_module_compression to ZIP_DEFLATED 42613 1727204575.88644: Set connection var ansible_connection to ssh 42613 1727204575.88647: Set connection var ansible_timeout to 10 42613 1727204575.88649: Set connection var ansible_shell_type to sh 42613 1727204575.88651: variable 'ansible_shell_executable' from source: unknown 42613 1727204575.88653: variable 'ansible_connection' from source: unknown 42613 1727204575.88655: variable 'ansible_module_compression' from source: unknown 42613 1727204575.88657: variable 'ansible_shell_type' from source: unknown 42613 1727204575.88660: variable 'ansible_shell_executable' from source: unknown 42613 1727204575.88662: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204575.88664: variable 'ansible_pipelining' from source: unknown 42613 1727204575.88668: variable 'ansible_timeout' from source: unknown 42613 1727204575.88670: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204575.88832: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 42613 1727204575.88856: variable 'omit' from source: magic vars 42613 1727204575.88868: starting attempt loop 42613 1727204575.88877: running the handler 42613 1727204575.88895: handler run complete 42613 1727204575.88912: attempt loop 
complete, returning result 42613 1727204575.88919: _execute() done 42613 1727204575.88928: dumping result to json 42613 1727204575.88963: done dumping result, returning 42613 1727204575.88966: done running TaskExecutor() for managed-node3/TASK: Set network provider to 'nm' [127b8e07-fff9-2f91-05d8-000000000007] 42613 1727204575.88971: sending task result for task 127b8e07-fff9-2f91-05d8-000000000007 ok: [managed-node3] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 42613 1727204575.89125: no more pending results, returning what we have 42613 1727204575.89128: results queue empty 42613 1727204575.89129: checking for any_errors_fatal 42613 1727204575.89136: done checking for any_errors_fatal 42613 1727204575.89137: checking for max_fail_percentage 42613 1727204575.89139: done checking for max_fail_percentage 42613 1727204575.89140: checking to see if all hosts have failed and the running result is not ok 42613 1727204575.89141: done checking to see if all hosts have failed 42613 1727204575.89141: getting the remaining hosts for this loop 42613 1727204575.89143: done getting the remaining hosts for this loop 42613 1727204575.89147: getting the next task for host managed-node3 42613 1727204575.89155: done getting next task for host managed-node3 42613 1727204575.89158: ^ task is: TASK: meta (flush_handlers) 42613 1727204575.89159: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204575.89168: getting variables 42613 1727204575.89170: in VariableManager get_vars() 42613 1727204575.89204: Calling all_inventory to load vars for managed-node3 42613 1727204575.89207: Calling groups_inventory to load vars for managed-node3 42613 1727204575.89210: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204575.89219: done sending task result for task 127b8e07-fff9-2f91-05d8-000000000007 42613 1727204575.89222: WORKER PROCESS EXITING 42613 1727204575.89280: Calling all_plugins_play to load vars for managed-node3 42613 1727204575.89284: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204575.89288: Calling groups_plugins_play to load vars for managed-node3 42613 1727204575.89514: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204575.89754: done with get_vars() 42613 1727204575.89772: done getting variables 42613 1727204575.89855: in VariableManager get_vars() 42613 1727204575.89870: Calling all_inventory to load vars for managed-node3 42613 1727204575.89873: Calling groups_inventory to load vars for managed-node3 42613 1727204575.89875: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204575.89881: Calling all_plugins_play to load vars for managed-node3 42613 1727204575.89883: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204575.89886: Calling groups_plugins_play to load vars for managed-node3 42613 1727204575.90505: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204575.90757: done with get_vars() 42613 1727204575.90781: done queuing things up, now waiting for results queue to drain 42613 1727204575.90783: results queue empty 42613 1727204575.90784: checking for any_errors_fatal 42613 1727204575.90788: done checking for any_errors_fatal 42613 1727204575.90789: checking for max_fail_percentage 42613 
1727204575.90790: done checking for max_fail_percentage 42613 1727204575.90790: checking to see if all hosts have failed and the running result is not ok 42613 1727204575.90791: done checking to see if all hosts have failed 42613 1727204575.90792: getting the remaining hosts for this loop 42613 1727204575.90793: done getting the remaining hosts for this loop 42613 1727204575.90796: getting the next task for host managed-node3 42613 1727204575.90801: done getting next task for host managed-node3 42613 1727204575.90803: ^ task is: TASK: meta (flush_handlers) 42613 1727204575.90804: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 42613 1727204575.90820: getting variables 42613 1727204575.90822: in VariableManager get_vars() 42613 1727204575.90836: Calling all_inventory to load vars for managed-node3 42613 1727204575.90839: Calling groups_inventory to load vars for managed-node3 42613 1727204575.90842: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204575.90847: Calling all_plugins_play to load vars for managed-node3 42613 1727204575.90850: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204575.90852: Calling groups_plugins_play to load vars for managed-node3 42613 1727204575.91022: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204575.91268: done with get_vars() 42613 1727204575.91279: done getting variables 42613 1727204575.91336: in VariableManager get_vars() 42613 1727204575.91347: Calling all_inventory to load vars for managed-node3 42613 1727204575.91349: Calling groups_inventory to load vars for managed-node3 42613 1727204575.91352: Calling all_plugins_inventory to load vars for managed-node3 42613 
1727204575.91362: Calling all_plugins_play to load vars for managed-node3 42613 1727204575.91366: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204575.91370: Calling groups_plugins_play to load vars for managed-node3 42613 1727204575.91575: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204575.91839: done with get_vars() 42613 1727204575.91854: done queuing things up, now waiting for results queue to drain 42613 1727204575.91856: results queue empty 42613 1727204575.91857: checking for any_errors_fatal 42613 1727204575.91858: done checking for any_errors_fatal 42613 1727204575.91859: checking for max_fail_percentage 42613 1727204575.91860: done checking for max_fail_percentage 42613 1727204575.91861: checking to see if all hosts have failed and the running result is not ok 42613 1727204575.91862: done checking to see if all hosts have failed 42613 1727204575.91862: getting the remaining hosts for this loop 42613 1727204575.91863: done getting the remaining hosts for this loop 42613 1727204575.91973: getting the next task for host managed-node3 42613 1727204575.91978: done getting next task for host managed-node3 42613 1727204575.91979: ^ task is: None 42613 1727204575.91981: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204575.91992: done queuing things up, now waiting for results queue to drain 42613 1727204575.91993: results queue empty 42613 1727204575.91994: checking for any_errors_fatal 42613 1727204575.91995: done checking for any_errors_fatal 42613 1727204575.91996: checking for max_fail_percentage 42613 1727204575.91997: done checking for max_fail_percentage 42613 1727204575.91998: checking to see if all hosts have failed and the running result is not ok 42613 1727204575.91998: done checking to see if all hosts have failed 42613 1727204575.92001: getting the next task for host managed-node3 42613 1727204575.92008: done getting next task for host managed-node3 42613 1727204575.92009: ^ task is: None 42613 1727204575.92011: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204575.92108: in VariableManager get_vars() 42613 1727204575.92141: done with get_vars() 42613 1727204575.92148: in VariableManager get_vars() 42613 1727204575.92162: done with get_vars() 42613 1727204575.92169: variable 'omit' from source: magic vars 42613 1727204575.92204: in VariableManager get_vars() 42613 1727204575.92223: done with get_vars() 42613 1727204575.92258: variable 'omit' from source: magic vars PLAY [Test for testing routing rules] ****************************************** 42613 1727204575.92624: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 42613 1727204575.92662: getting the remaining hosts for this loop 42613 1727204575.92663: done getting the remaining hosts for this loop 42613 1727204575.92668: getting the next task for host managed-node3 42613 1727204575.92672: done getting next task for host managed-node3 42613 1727204575.92674: ^ task is: TASK: Gathering Facts 42613 1727204575.92676: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204575.92678: getting variables 42613 1727204575.92679: in VariableManager get_vars() 42613 1727204575.92692: Calling all_inventory to load vars for managed-node3 42613 1727204575.92694: Calling groups_inventory to load vars for managed-node3 42613 1727204575.92697: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204575.92703: Calling all_plugins_play to load vars for managed-node3 42613 1727204575.92719: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204575.92722: Calling groups_plugins_play to load vars for managed-node3 42613 1727204575.92916: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204575.93171: done with get_vars() 42613 1727204575.93182: done getting variables 42613 1727204575.93240: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:3 Tuesday 24 September 2024 15:02:55 -0400 (0:00:00.076) 0:00:04.540 ***** 42613 1727204575.93273: entering _queue_task() for managed-node3/gather_facts 42613 1727204575.93635: worker is 1 (out of 1 available) 42613 1727204575.93650: exiting _queue_task() for managed-node3/gather_facts 42613 1727204575.93663: done queuing things up, now waiting for results queue to drain 42613 1727204575.93664: waiting for pending results... 
42613 1727204575.93892: running TaskExecutor() for managed-node3/TASK: Gathering Facts 42613 1727204575.94046: in run() - task 127b8e07-fff9-2f91-05d8-00000000010b 42613 1727204575.94050: variable 'ansible_search_path' from source: unknown 42613 1727204575.94157: calling self._execute() 42613 1727204575.94193: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204575.94206: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204575.94222: variable 'omit' from source: magic vars 42613 1727204575.95511: variable 'ansible_distribution_major_version' from source: facts 42613 1727204575.95515: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204575.95517: variable 'omit' from source: magic vars 42613 1727204575.95519: variable 'omit' from source: magic vars 42613 1727204575.95578: variable 'omit' from source: magic vars 42613 1727204575.95635: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 42613 1727204575.95684: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 42613 1727204575.95712: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 42613 1727204575.95740: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204575.95757: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204575.95798: variable 'inventory_hostname' from source: host vars for 'managed-node3' 42613 1727204575.95805: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204575.95812: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204575.95931: Set connection var ansible_shell_executable to /bin/sh 42613 1727204575.95950: Set 
connection var ansible_pipelining to False 42613 1727204575.95965: Set connection var ansible_module_compression to ZIP_DEFLATED 42613 1727204575.96052: Set connection var ansible_connection to ssh 42613 1727204575.96055: Set connection var ansible_timeout to 10 42613 1727204575.96057: Set connection var ansible_shell_type to sh 42613 1727204575.96059: variable 'ansible_shell_executable' from source: unknown 42613 1727204575.96061: variable 'ansible_connection' from source: unknown 42613 1727204575.96063: variable 'ansible_module_compression' from source: unknown 42613 1727204575.96067: variable 'ansible_shell_type' from source: unknown 42613 1727204575.96069: variable 'ansible_shell_executable' from source: unknown 42613 1727204575.96071: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204575.96072: variable 'ansible_pipelining' from source: unknown 42613 1727204575.96074: variable 'ansible_timeout' from source: unknown 42613 1727204575.96076: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204575.96259: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 42613 1727204575.96584: variable 'omit' from source: magic vars 42613 1727204575.96587: starting attempt loop 42613 1727204575.96590: running the handler 42613 1727204575.96593: variable 'ansible_facts' from source: unknown 42613 1727204575.96595: _low_level_execute_command(): starting 42613 1727204575.96597: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 42613 1727204575.97439: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204575.97468: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 42613 
1727204575.97486: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204575.97585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204575.97619: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204575.97663: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204575.97736: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 42613 1727204576.00337: stdout chunk (state=3): >>>/root <<< 42613 1727204576.00548: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204576.00743: stderr chunk (state=3): >>><<< 42613 1727204576.00747: stdout chunk (state=3): >>><<< 42613 1727204576.00781: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 42613 1727204576.00818: _low_level_execute_command(): starting 42613 1727204576.00826: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204576.007808-42915-69202067754212 `" && echo ansible-tmp-1727204576.007808-42915-69202067754212="` echo /root/.ansible/tmp/ansible-tmp-1727204576.007808-42915-69202067754212 `" ) && sleep 0' 42613 1727204576.01527: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204576.01545: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204576.01562: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204576.01587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 42613 1727204576.01603: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 42613 1727204576.01624: stderr chunk (state=3): >>>debug2: match not found <<< 42613 1727204576.01639: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204576.01658: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 42613 1727204576.01684: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.169 is address <<< 42613 1727204576.01695: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 42613 1727204576.01707: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204576.01801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204576.01813: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204576.01959: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 42613 1727204576.05305: stdout chunk (state=3): >>>ansible-tmp-1727204576.007808-42915-69202067754212=/root/.ansible/tmp/ansible-tmp-1727204576.007808-42915-69202067754212 <<< 42613 1727204576.05667: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204576.05672: stdout chunk (state=3): >>><<< 42613 1727204576.05674: stderr chunk (state=3): >>><<< 42613 1727204576.05879: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204576.007808-42915-69202067754212=/root/.ansible/tmp/ansible-tmp-1727204576.007808-42915-69202067754212 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 42613 1727204576.05884: variable 'ansible_module_compression' from source: unknown 42613 1727204576.05886: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-42613igbktgy6/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 42613 1727204576.05889: variable 'ansible_facts' from source: unknown 42613 1727204576.06077: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204576.007808-42915-69202067754212/AnsiballZ_setup.py 42613 1727204576.06331: Sending initial data 42613 1727204576.06341: Sent initial data (152 bytes) 42613 1727204576.07001: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204576.07017: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204576.07031: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204576.07050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 42613 1727204576.07065: stderr chunk (state=3): >>>debug2: checking match 
for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 42613 1727204576.07086: stderr chunk (state=3): >>>debug2: match not found <<< 42613 1727204576.07099: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204576.07190: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204576.07213: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204576.07239: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204576.07253: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204576.07362: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 42613 1727204576.10011: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension 
"home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 42613 1727204576.10119: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 42613 1727204576.10242: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-42613igbktgy6/tmpk958ax6l /root/.ansible/tmp/ansible-tmp-1727204576.007808-42915-69202067754212/AnsiballZ_setup.py <<< 42613 1727204576.10246: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204576.007808-42915-69202067754212/AnsiballZ_setup.py" <<< 42613 1727204576.10312: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-42613igbktgy6/tmpk958ax6l" to remote "/root/.ansible/tmp/ansible-tmp-1727204576.007808-42915-69202067754212/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204576.007808-42915-69202067754212/AnsiballZ_setup.py" <<< 42613 1727204576.12243: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204576.12398: stderr chunk (state=3): >>><<< 42613 1727204576.12402: stdout chunk (state=3): >>><<< 42613 1727204576.12404: done transferring module to remote 42613 1727204576.12407: _low_level_execute_command(): starting 42613 1727204576.12409: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204576.007808-42915-69202067754212/ /root/.ansible/tmp/ansible-tmp-1727204576.007808-42915-69202067754212/AnsiballZ_setup.py && sleep 0' 42613 1727204576.13050: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204576.13071: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204576.13086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204576.13126: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204576.13243: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204576.13261: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204576.13289: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204576.13397: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 42613 1727204576.16362: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204576.16389: stdout chunk (state=3): >>><<< 42613 1727204576.16410: stderr chunk (state=3): >>><<< 42613 1727204576.16523: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 42613 1727204576.16527: _low_level_execute_command(): starting 42613 1727204576.16530: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204576.007808-42915-69202067754212/AnsiballZ_setup.py && sleep 0' 42613 1727204576.17194: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204576.17214: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204576.17244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204576.17264: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 42613 1727204576.17287: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 42613 1727204576.17348: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204576.17408: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204576.17429: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204576.17467: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204576.17598: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 42613 1727204577.12641: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDMno02CCUpdfdqhSLw3DEMgRB2qjjltGvjvvCA4FaqYC6LCLimL/modyFwZBTr6mod5aVbEv6eVT3cFEFELkt0kQvtOziptiFW5YZ0dlqvF004nHv7tpUqUboaKXf3hY9kfDIHOuUKZOV1AH7UTuNGixxTuYXFQ+fG7hLGh4Vep864Qk6wN5hv56JDtXEzMMB7xxbnEU6nTFIA8TIX+aYYYxIipVJjI+TR9J9VhQf+oNDJhhqqHyobnqG5WTt3jEYQo+8cWC4B8LegOCwae4jpCrLwhKnzmvV787NTqy90vgHgain4GhWTCKI+2dFsqryKBgKIBuENXOpmpRGT4gqBQHbc5v/vxWqYoPPhg1Wb8R+WRueYbdol4I10CveFNShlWCSRLSu/vOutS0xtU3WEIQFs2Mn06Aqq6bMoG70EJ9bJOEQ82f23VIkVKoW1hmcKHTCrMv715oNONo08LOTQkBYDv3MQpAtFQnpuIVPlAXYu1spAx3i2i31866ukCUE=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBG5WDcSyqLv17rg+6P3+2pjKf2x2X+Jf8yHGACagVeIm/l8LWG2NszXTHOdaZlbD4aes7hBRe0B7oCa8ilqHGf0=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAII5o3yNikV31ncy7je2EsNwog36vbYT7D9w98r4ZeD7x", 
"ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_lsb": {}, "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2f45753f4562ec5cb5af974e0ba4b7", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3021, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 695, "free": 3021}, "nocache": {"free": 3465, "used": 251}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", 
"ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2f4575-3f45-62ec-5cb5-af974e0ba4b7", "ansible_product_uuid": "ec2f4575-3f45-62ec-5cb5-af974e0ba4b7", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {<<< 42613 1727204577.12737: stdout chunk (state=3): >>>"links": {"ids": [], "uuids": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 914, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": 
"/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251303763968, "block_size": 4096, "block_total": 64479564, "block_available": 61353458, "block_used": 3126106, "inode_total": 16384000, "inode_available": 16301442, "inode_used": 82558, "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391"}], "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 39754 10.31.45.169 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 39754 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_local": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", 
"ttyS0,115200n8"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_is_chroot": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "02", "second": "57", "epoch": "1727204577", "epoch_int": "1727204577", "date": "2024-09-24", "time": "15:02:57", "iso8601_micro": "2024-09-24T19:02:57.067546Z", "iso8601": "2024-09-24T19:02:57Z", "iso8601_basic": "20240924T150257067546", "iso8601_basic_short": "20240924T150257", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_iscsi_iqn": "", "ansible_fibre_channel_wwn": [], "ansible_apparmor": {"status": "disabled"}, "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", 
"vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "02:aa:78:a8:9b:13", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.169", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::aa:78ff:fea8:9b13", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": 
"off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", 
"hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.169", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:aa:78:a8:9b:13", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.169"], "ansible_all_ipv6_addresses": ["fe80::aa:78ff:fea8:9b13"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.169", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::aa:78ff:fea8:9b13"]}, "ansible_fips": false, "ansible_loadavg": {"1m": 0.712890625, "5m": 0.654296875, "15m": 0.42626953125}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 42613 1727204577.15839: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204577.15958: stderr chunk (state=3): >>>Shared connection to 10.31.45.169 closed. 
<<< 42613 1727204577.15977: stderr chunk (state=3): >>><<< 42613 1727204577.15986: stdout chunk (state=3): >>><<< 42613 1727204577.16030: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDMno02CCUpdfdqhSLw3DEMgRB2qjjltGvjvvCA4FaqYC6LCLimL/modyFwZBTr6mod5aVbEv6eVT3cFEFELkt0kQvtOziptiFW5YZ0dlqvF004nHv7tpUqUboaKXf3hY9kfDIHOuUKZOV1AH7UTuNGixxTuYXFQ+fG7hLGh4Vep864Qk6wN5hv56JDtXEzMMB7xxbnEU6nTFIA8TIX+aYYYxIipVJjI+TR9J9VhQf+oNDJhhqqHyobnqG5WTt3jEYQo+8cWC4B8LegOCwae4jpCrLwhKnzmvV787NTqy90vgHgain4GhWTCKI+2dFsqryKBgKIBuENXOpmpRGT4gqBQHbc5v/vxWqYoPPhg1Wb8R+WRueYbdol4I10CveFNShlWCSRLSu/vOutS0xtU3WEIQFs2Mn06Aqq6bMoG70EJ9bJOEQ82f23VIkVKoW1hmcKHTCrMv715oNONo08LOTQkBYDv3MQpAtFQnpuIVPlAXYu1spAx3i2i31866ukCUE=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBG5WDcSyqLv17rg+6P3+2pjKf2x2X+Jf8yHGACagVeIm/l8LWG2NszXTHOdaZlbD4aes7hBRe0B7oCa8ilqHGf0=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAII5o3yNikV31ncy7je2EsNwog36vbYT7D9w98r4ZeD7x", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, 
"config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_lsb": {}, "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2f45753f4562ec5cb5af974e0ba4b7", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3021, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 695, "free": 3021}, "nocache": {"free": 3465, "used": 251}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2f4575-3f45-62ec-5cb5-af974e0ba4b7", "ansible_product_uuid": "ec2f4575-3f45-62ec-5cb5-af974e0ba4b7", "ansible_product_version": 
"4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 914, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251303763968, "block_size": 4096, "block_total": 64479564, "block_available": 61353458, "block_used": 3126106, "inode_total": 16384000, "inode_available": 16301442, "inode_used": 82558, "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391"}], "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": 
"/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 39754 10.31.45.169 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 39754 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_local": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_is_chroot": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "02", "second": "57", "epoch": "1727204577", "epoch_int": "1727204577", "date": "2024-09-24", "time": "15:02:57", "iso8601_micro": "2024-09-24T19:02:57.067546Z", "iso8601": "2024-09-24T19:02:57Z", "iso8601_basic": 
"20240924T150257067546", "iso8601_basic_short": "20240924T150257", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_iscsi_iqn": "", "ansible_fibre_channel_wwn": [], "ansible_apparmor": {"status": "disabled"}, "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", 
"tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "02:aa:78:a8:9b:13", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.169", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::aa:78ff:fea8:9b13", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", 
"rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.169", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:aa:78:a8:9b:13", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, 
"ansible_all_ipv4_addresses": ["10.31.45.169"], "ansible_all_ipv6_addresses": ["fe80::aa:78ff:fea8:9b13"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.169", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::aa:78ff:fea8:9b13"]}, "ansible_fips": false, "ansible_loadavg": {"1m": 0.712890625, "5m": 0.654296875, "15m": 0.42626953125}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
42613 1727204577.16579: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204576.007808-42915-69202067754212/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 42613 1727204577.16583: _low_level_execute_command(): starting 42613 1727204577.16586: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204576.007808-42915-69202067754212/ > /dev/null 2>&1 && sleep 0' 42613 1727204577.17300: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204577.17322: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204577.17356: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204577.17470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204577.17844: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204577.17936: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 42613 1727204577.20935: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204577.20940: stdout chunk (state=3): >>><<< 42613 1727204577.20942: stderr chunk (state=3): >>><<< 42613 1727204577.21173: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 42613 1727204577.21180: handler run complete 42613 1727204577.21483: 
variable 'ansible_facts' from source: unknown 42613 1727204577.22173: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204577.23281: variable 'ansible_facts' from source: unknown 42613 1727204577.23564: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204577.23905: attempt loop complete, returning result 42613 1727204577.23962: _execute() done 42613 1727204577.23975: dumping result to json 42613 1727204577.24022: done dumping result, returning 42613 1727204577.24075: done running TaskExecutor() for managed-node3/TASK: Gathering Facts [127b8e07-fff9-2f91-05d8-00000000010b] 42613 1727204577.24110: sending task result for task 127b8e07-fff9-2f91-05d8-00000000010b 42613 1727204577.25426: done sending task result for task 127b8e07-fff9-2f91-05d8-00000000010b 42613 1727204577.25431: WORKER PROCESS EXITING ok: [managed-node3] 42613 1727204577.26172: no more pending results, returning what we have 42613 1727204577.26175: results queue empty 42613 1727204577.26176: checking for any_errors_fatal 42613 1727204577.26178: done checking for any_errors_fatal 42613 1727204577.26179: checking for max_fail_percentage 42613 1727204577.26180: done checking for max_fail_percentage 42613 1727204577.26181: checking to see if all hosts have failed and the running result is not ok 42613 1727204577.26182: done checking to see if all hosts have failed 42613 1727204577.26183: getting the remaining hosts for this loop 42613 1727204577.26184: done getting the remaining hosts for this loop 42613 1727204577.26188: getting the next task for host managed-node3 42613 1727204577.26193: done getting next task for host managed-node3 42613 1727204577.26195: ^ task is: TASK: meta (flush_handlers) 42613 1727204577.26197: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, 
pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 42613 1727204577.26201: getting variables 42613 1727204577.26202: in VariableManager get_vars() 42613 1727204577.26352: Calling all_inventory to load vars for managed-node3 42613 1727204577.26356: Calling groups_inventory to load vars for managed-node3 42613 1727204577.26358: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204577.26372: Calling all_plugins_play to load vars for managed-node3 42613 1727204577.26375: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204577.26378: Calling groups_plugins_play to load vars for managed-node3 42613 1727204577.26803: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204577.27250: done with get_vars() 42613 1727204577.27267: done getting variables 42613 1727204577.27473: in VariableManager get_vars() 42613 1727204577.27489: Calling all_inventory to load vars for managed-node3 42613 1727204577.27491: Calling groups_inventory to load vars for managed-node3 42613 1727204577.27494: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204577.27499: Calling all_plugins_play to load vars for managed-node3 42613 1727204577.27502: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204577.27505: Calling groups_plugins_play to load vars for managed-node3 42613 1727204577.28007: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204577.28476: done with get_vars() 42613 1727204577.28494: done queuing things up, now waiting for results queue to drain 42613 1727204577.28496: results queue empty 42613 1727204577.28497: checking for any_errors_fatal 42613 1727204577.28502: done checking for any_errors_fatal 42613 1727204577.28503: checking for max_fail_percentage 42613 
1727204577.28504: done checking for max_fail_percentage 42613 1727204577.28504: checking to see if all hosts have failed and the running result is not ok 42613 1727204577.28627: done checking to see if all hosts have failed 42613 1727204577.28629: getting the remaining hosts for this loop 42613 1727204577.28630: done getting the remaining hosts for this loop 42613 1727204577.28636: getting the next task for host managed-node3 42613 1727204577.28641: done getting next task for host managed-node3 42613 1727204577.28644: ^ task is: TASK: Set type={{ type }} and interface={{ interface }} 42613 1727204577.28646: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 42613 1727204577.28648: getting variables 42613 1727204577.28649: in VariableManager get_vars() 42613 1727204577.28663: Calling all_inventory to load vars for managed-node3 42613 1727204577.28668: Calling groups_inventory to load vars for managed-node3 42613 1727204577.28670: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204577.28677: Calling all_plugins_play to load vars for managed-node3 42613 1727204577.28680: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204577.28683: Calling groups_plugins_play to load vars for managed-node3 42613 1727204577.29052: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204577.29985: done with get_vars() 42613 1727204577.30001: done getting variables 42613 1727204577.30264: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 42613 1727204577.30551: variable 'type' from source: play vars 42613 1727204577.30560: variable 'interface' from source: play vars TASK [Set type=veth and interface=ethtest0] ************************************ task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:10 Tuesday 24 September 2024 15:02:57 -0400 (0:00:01.374) 0:00:05.915 ***** 42613 1727204577.30692: entering _queue_task() for managed-node3/set_fact 42613 1727204577.31505: worker is 1 (out of 1 available) 42613 1727204577.31521: exiting _queue_task() for managed-node3/set_fact 42613 1727204577.31538: done queuing things up, now waiting for results queue to drain 42613 1727204577.31539: waiting for pending results... 42613 1727204577.32048: running TaskExecutor() for managed-node3/TASK: Set type=veth and interface=ethtest0 42613 1727204577.32145: in run() - task 127b8e07-fff9-2f91-05d8-00000000000b 42613 1727204577.32149: variable 'ansible_search_path' from source: unknown 42613 1727204577.32617: calling self._execute() 42613 1727204577.32789: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204577.32793: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204577.32872: variable 'omit' from source: magic vars 42613 1727204577.33573: variable 'ansible_distribution_major_version' from source: facts 42613 1727204577.33577: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204577.33580: variable 'omit' from source: magic vars 42613 1727204577.33785: variable 'omit' from source: magic vars 42613 1727204577.33820: variable 'type' from source: play vars 42613 1727204577.33914: variable 'type' from source: play vars 42613 1727204577.33924: variable 'interface' from source: 
play vars 42613 1727204577.34371: variable 'interface' from source: play vars 42613 1727204577.34375: variable 'omit' from source: magic vars 42613 1727204577.34380: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 42613 1727204577.34383: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 42613 1727204577.34386: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 42613 1727204577.34398: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204577.34402: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204577.34595: variable 'inventory_hostname' from source: host vars for 'managed-node3' 42613 1727204577.34599: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204577.34601: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204577.34839: Set connection var ansible_shell_executable to /bin/sh 42613 1727204577.34842: Set connection var ansible_pipelining to False 42613 1727204577.34845: Set connection var ansible_module_compression to ZIP_DEFLATED 42613 1727204577.34847: Set connection var ansible_connection to ssh 42613 1727204577.34849: Set connection var ansible_timeout to 10 42613 1727204577.34852: Set connection var ansible_shell_type to sh 42613 1727204577.34854: variable 'ansible_shell_executable' from source: unknown 42613 1727204577.35054: variable 'ansible_connection' from source: unknown 42613 1727204577.35057: variable 'ansible_module_compression' from source: unknown 42613 1727204577.35060: variable 'ansible_shell_type' from source: unknown 42613 1727204577.35062: variable 'ansible_shell_executable' from source: unknown 42613 1727204577.35064: variable 'ansible_host' from 
source: host vars for 'managed-node3'
42613 1727204577.35086: variable 'ansible_pipelining' from source: unknown
42613 1727204577.35089: variable 'ansible_timeout' from source: unknown
42613 1727204577.35092: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
42613 1727204577.35387: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
42613 1727204577.35390: variable 'omit' from source: magic vars
42613 1727204577.35393: starting attempt loop
42613 1727204577.35396: running the handler
42613 1727204577.35398: handler run complete
42613 1727204577.35494: attempt loop complete, returning result
42613 1727204577.35497: _execute() done
42613 1727204577.35499: dumping result to json
42613 1727204577.35501: done dumping result, returning
42613 1727204577.35503: done running TaskExecutor() for managed-node3/TASK: Set type=veth and interface=ethtest0 [127b8e07-fff9-2f91-05d8-00000000000b]
42613 1727204577.35505: sending task result for task 127b8e07-fff9-2f91-05d8-00000000000b
ok: [managed-node3] => {
    "ansible_facts": {
        "interface": "ethtest0",
        "type": "veth"
    },
    "changed": false
}
42613 1727204577.35783: no more pending results, returning what we have
42613 1727204577.35787: results queue empty
42613 1727204577.35789: checking for any_errors_fatal
42613 1727204577.35791: done checking for any_errors_fatal
42613 1727204577.35792: checking for max_fail_percentage
42613 1727204577.35799: done checking for max_fail_percentage
42613 1727204577.35800: checking to see if all hosts have failed and the running result is not ok
42613 1727204577.35801: done checking to see if all hosts have failed
42613 1727204577.35801: getting the remaining hosts for this loop
42613 1727204577.35803: done getting the remaining hosts for this loop
42613 1727204577.35807: getting the next task for host managed-node3
42613 1727204577.35824: done getting next task for host managed-node3
42613 1727204577.35827: ^ task is: TASK: Include the task 'show_interfaces.yml'
42613 1727204577.35829: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
42613 1727204577.35837: getting variables
42613 1727204577.35839: in VariableManager get_vars()
42613 1727204577.35883: Calling all_inventory to load vars for managed-node3
42613 1727204577.35887: Calling groups_inventory to load vars for managed-node3
42613 1727204577.35889: Calling all_plugins_inventory to load vars for managed-node3
42613 1727204577.36160: Calling all_plugins_play to load vars for managed-node3
42613 1727204577.36164: Calling groups_plugins_inventory to load vars for managed-node3
42613 1727204577.36169: Calling groups_plugins_play to load vars for managed-node3
42613 1727204577.36996: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
42613 1727204577.37681: done sending task result for task 127b8e07-fff9-2f91-05d8-00000000000b
42613 1727204577.37686: WORKER PROCESS EXITING
42613 1727204577.37735: done with get_vars()
42613 1727204577.37751: done getting variables

TASK [Include the task 'show_interfaces.yml'] **********************************
task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:14
Tuesday 24 September 2024 15:02:57 -0400 (0:00:00.073) 0:00:05.988 *****
42613 1727204577.38030: entering _queue_task() for managed-node3/include_tasks
42613 1727204577.38907: worker is 1 (out of 1 available)
42613 1727204577.38930: exiting _queue_task() for managed-node3/include_tasks
42613 1727204577.38947: done queuing things up, now waiting for results queue to drain
42613 1727204577.38949: waiting for pending results...
42613 1727204577.39974: running TaskExecutor() for managed-node3/TASK: Include the task 'show_interfaces.yml'
42613 1727204577.40064: in run() - task 127b8e07-fff9-2f91-05d8-00000000000c
42613 1727204577.40821: variable 'ansible_search_path' from source: unknown
42613 1727204577.40830: calling self._execute()
42613 1727204577.40989: variable 'ansible_host' from source: host vars for 'managed-node3'
42613 1727204577.40994: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
42613 1727204577.40998: variable 'omit' from source: magic vars
42613 1727204577.42676: variable 'ansible_distribution_major_version' from source: facts
42613 1727204577.42797: Evaluated conditional (ansible_distribution_major_version != '6'): True
42613 1727204577.42805: _execute() done
42613 1727204577.42808: dumping result to json
42613 1727204577.42811: done dumping result, returning
42613 1727204577.42822: done running TaskExecutor() for managed-node3/TASK: Include the task 'show_interfaces.yml' [127b8e07-fff9-2f91-05d8-00000000000c]
42613 1727204577.42830: sending task result for task 127b8e07-fff9-2f91-05d8-00000000000c
42613 1727204577.43175: done sending task result for task 127b8e07-fff9-2f91-05d8-00000000000c
42613 1727204577.43180: WORKER PROCESS EXITING
42613 1727204577.43220: no more pending results, returning what we have
42613 1727204577.43226: in VariableManager get_vars()
42613 1727204577.43283: Calling all_inventory to load vars for managed-node3
42613 1727204577.43287: Calling groups_inventory to load vars for managed-node3
42613 1727204577.43291: Calling all_plugins_inventory to load vars for managed-node3
42613 1727204577.43308: Calling all_plugins_play to load vars for managed-node3
42613 1727204577.43312: Calling groups_plugins_inventory to load vars for managed-node3
42613 1727204577.43315: Calling groups_plugins_play to load vars for managed-node3
42613 1727204577.43655: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
42613 1727204577.44199: done with get_vars()
42613 1727204577.44209: variable 'ansible_search_path' from source: unknown
42613 1727204577.44225: we have included files to process
42613 1727204577.44226: generating all_blocks data
42613 1727204577.44228: done generating all_blocks data
42613 1727204577.44228: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml
42613 1727204577.44230: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml
42613 1727204577.44232: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml
42613 1727204577.44649: in VariableManager get_vars()
42613 1727204577.44674: done with get_vars()
42613 1727204577.45038: done processing included file
42613 1727204577.45041: iterating over new_blocks loaded from include file
42613 1727204577.45042: in VariableManager get_vars()
42613 1727204577.45060: done with get_vars()
42613 1727204577.45062: filtering new block on tags
42613 1727204577.45140: done filtering new block on tags
42613 1727204577.45143: done iterating over new_blocks loaded from include file
included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node3
42613 1727204577.45149: extending task lists for all hosts with included blocks
42613 1727204577.48349: done extending task lists
42613 1727204577.48352: done processing included files
42613 1727204577.48353: results queue empty
42613 1727204577.48353: checking for any_errors_fatal
42613 1727204577.48357: done checking for any_errors_fatal
42613 1727204577.48357: checking for max_fail_percentage
42613 1727204577.48359: done checking for max_fail_percentage
42613 1727204577.48360: checking to see if all hosts have failed and the running result is not ok
42613 1727204577.48361: done checking to see if all hosts have failed
42613 1727204577.48362: getting the remaining hosts for this loop
42613 1727204577.48363: done getting the remaining hosts for this loop
42613 1727204577.48368: getting the next task for host managed-node3
42613 1727204577.48372: done getting next task for host managed-node3
42613 1727204577.48375: ^ task is: TASK: Include the task 'get_current_interfaces.yml'
42613 1727204577.48377: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
42613 1727204577.48380: getting variables
42613 1727204577.48381: in VariableManager get_vars()
42613 1727204577.48399: Calling all_inventory to load vars for managed-node3
42613 1727204577.48402: Calling groups_inventory to load vars for managed-node3
42613 1727204577.48404: Calling all_plugins_inventory to load vars for managed-node3
42613 1727204577.48413: Calling all_plugins_play to load vars for managed-node3
42613 1727204577.48416: Calling groups_plugins_inventory to load vars for managed-node3
42613 1727204577.48420: Calling groups_plugins_play to load vars for managed-node3
42613 1727204577.48945: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
42613 1727204577.49587: done with get_vars()
42613 1727204577.49602: done getting variables

TASK [Include the task 'get_current_interfaces.yml'] ***************************
task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3
Tuesday 24 September 2024 15:02:57 -0400 (0:00:00.117) 0:00:06.106 *****
42613 1727204577.49811: entering _queue_task() for managed-node3/include_tasks
42613 1727204577.50630: worker is 1 (out of 1 available)
42613 1727204577.50645: exiting _queue_task() for managed-node3/include_tasks
42613 1727204577.50778: done queuing things up, now waiting for results queue to drain
42613 1727204577.50780: waiting for pending results...
42613 1727204577.51427: running TaskExecutor() for managed-node3/TASK: Include the task 'get_current_interfaces.yml'
42613 1727204577.52355: in run() - task 127b8e07-fff9-2f91-05d8-000000000121
42613 1727204577.52367: variable 'ansible_search_path' from source: unknown
42613 1727204577.52379: variable 'ansible_search_path' from source: unknown
42613 1727204577.52735: calling self._execute()
42613 1727204577.52919: variable 'ansible_host' from source: host vars for 'managed-node3'
42613 1727204577.52926: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
42613 1727204577.52934: variable 'omit' from source: magic vars
42613 1727204577.54036: variable 'ansible_distribution_major_version' from source: facts
42613 1727204577.54042: Evaluated conditional (ansible_distribution_major_version != '6'): True
42613 1727204577.54044: _execute() done
42613 1727204577.54047: dumping result to json
42613 1727204577.54049: done dumping result, returning
42613 1727204577.54051: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_current_interfaces.yml' [127b8e07-fff9-2f91-05d8-000000000121]
42613 1727204577.54053: sending task result for task 127b8e07-fff9-2f91-05d8-000000000121
42613 1727204577.54401: no more pending results, returning what we have
42613 1727204577.54407: in VariableManager get_vars()
42613 1727204577.54462: Calling all_inventory to load vars for managed-node3
42613 1727204577.54668: Calling groups_inventory to load vars for managed-node3
42613 1727204577.54672: Calling all_plugins_inventory to load vars for managed-node3
42613 1727204577.54686: Calling all_plugins_play to load vars for managed-node3
42613 1727204577.54689: Calling groups_plugins_inventory to load vars for managed-node3
42613 1727204577.54692: Calling groups_plugins_play to load vars for managed-node3
42613 1727204577.55220: done sending task result for task 127b8e07-fff9-2f91-05d8-000000000121
42613 1727204577.55225: WORKER PROCESS EXITING
42613 1727204577.55252: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
42613 1727204577.56046: done with get_vars()
42613 1727204577.56056: variable 'ansible_search_path' from source: unknown
42613 1727204577.56057: variable 'ansible_search_path' from source: unknown
42613 1727204577.56217: we have included files to process
42613 1727204577.56218: generating all_blocks data
42613 1727204577.56220: done generating all_blocks data
42613 1727204577.56221: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml
42613 1727204577.56223: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml
42613 1727204577.56225: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml
42613 1727204577.57058: done processing included file
42613 1727204577.57061: iterating over new_blocks loaded from include file
42613 1727204577.57063: in VariableManager get_vars()
42613 1727204577.57118: done with get_vars()
42613 1727204577.57121: filtering new block on tags
42613 1727204577.57145: done filtering new block on tags
42613 1727204577.57148: done iterating over new_blocks loaded from include file
included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed-node3
42613 1727204577.57153: extending task lists for all hosts with included blocks
42613 1727204577.57485: done extending task lists
42613 1727204577.57487: done processing included files
42613 1727204577.57487: results queue empty
42613 1727204577.57488: checking for any_errors_fatal
42613 1727204577.57492: done checking for any_errors_fatal
42613 1727204577.57493: checking for max_fail_percentage
42613 1727204577.57495: done checking for max_fail_percentage
42613 1727204577.57496: checking to see if all hosts have failed and the running result is not ok
42613 1727204577.57497: done checking to see if all hosts have failed
42613 1727204577.57498: getting the remaining hosts for this loop
42613 1727204577.57499: done getting the remaining hosts for this loop
42613 1727204577.57503: getting the next task for host managed-node3
42613 1727204577.57507: done getting next task for host managed-node3
42613 1727204577.57510: ^ task is: TASK: Gather current interface info
42613 1727204577.57513: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
42613 1727204577.57517: getting variables
42613 1727204577.57518: in VariableManager get_vars()
42613 1727204577.57536: Calling all_inventory to load vars for managed-node3
42613 1727204577.57539: Calling groups_inventory to load vars for managed-node3
42613 1727204577.57542: Calling all_plugins_inventory to load vars for managed-node3
42613 1727204577.57548: Calling all_plugins_play to load vars for managed-node3
42613 1727204577.57551: Calling groups_plugins_inventory to load vars for managed-node3
42613 1727204577.57554: Calling groups_plugins_play to load vars for managed-node3
42613 1727204577.58148: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
42613 1727204577.58610: done with get_vars()
42613 1727204577.58625: done getting variables
42613 1727204577.58697: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Gather current interface info] *******************************************
task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3
Tuesday 24 September 2024 15:02:57 -0400 (0:00:00.089) 0:00:06.195 *****
42613 1727204577.58731: entering _queue_task() for managed-node3/command
42613 1727204577.59920: worker is 1 (out of 1 available)
42613 1727204577.59931: exiting _queue_task() for managed-node3/command
42613 1727204577.59945: done queuing things up, now waiting for results queue to drain
42613 1727204577.59946: waiting for pending results...
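The command task queued here can be reconstructed from the `module_args` echoed in its result further down in the log (`chdir: /sys/class/net`, `_raw_params: ls -1`). A sketch of what get_current_interfaces.yml:3 plausibly contains (the register variable name is a guess; only the name, command, and chdir are grounded in the log):

```yaml
# Hypothetical reconstruction of get_current_interfaces.yml:3, based on the
# module_args visible in the task result below.
- name: Gather current interface info
  command:
    cmd: ls -1
    chdir: /sys/class/net
  register: _current_interfaces  # register name assumed, not shown in the log
```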
42613 1727204577.60390: running TaskExecutor() for managed-node3/TASK: Gather current interface info
42613 1727204577.60472: in run() - task 127b8e07-fff9-2f91-05d8-0000000001b0
42613 1727204577.60885: variable 'ansible_search_path' from source: unknown
42613 1727204577.60889: variable 'ansible_search_path' from source: unknown
42613 1727204577.60892: calling self._execute()
42613 1727204577.61352: variable 'ansible_host' from source: host vars for 'managed-node3'
42613 1727204577.61357: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
42613 1727204577.61361: variable 'omit' from source: magic vars
42613 1727204577.62641: variable 'ansible_distribution_major_version' from source: facts
42613 1727204577.62654: Evaluated conditional (ansible_distribution_major_version != '6'): True
42613 1727204577.62663: variable 'omit' from source: magic vars
42613 1727204577.62718: variable 'omit' from source: magic vars
42613 1727204577.63100: variable 'omit' from source: magic vars
42613 1727204577.63147: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
42613 1727204577.63310: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
42613 1727204577.63331: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
42613 1727204577.63354: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
42613 1727204577.63368: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
42613 1727204577.63758: variable 'inventory_hostname' from source: host vars for 'managed-node3'
42613 1727204577.63763: variable 'ansible_host' from source: host vars for 'managed-node3'
42613 1727204577.63768: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
42613 1727204577.64019: Set connection var ansible_shell_executable to /bin/sh
42613 1727204577.64023: Set connection var ansible_pipelining to False
42613 1727204577.64026: Set connection var ansible_module_compression to ZIP_DEFLATED
42613 1727204577.64029: Set connection var ansible_connection to ssh
42613 1727204577.64032: Set connection var ansible_timeout to 10
42613 1727204577.64037: Set connection var ansible_shell_type to sh
42613 1727204577.64675: variable 'ansible_shell_executable' from source: unknown
42613 1727204577.64680: variable 'ansible_connection' from source: unknown
42613 1727204577.64683: variable 'ansible_module_compression' from source: unknown
42613 1727204577.64686: variable 'ansible_shell_type' from source: unknown
42613 1727204577.64688: variable 'ansible_shell_executable' from source: unknown
42613 1727204577.64691: variable 'ansible_host' from source: host vars for 'managed-node3'
42613 1727204577.64693: variable 'ansible_pipelining' from source: unknown
42613 1727204577.64696: variable 'ansible_timeout' from source: unknown
42613 1727204577.64698: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
42613 1727204577.64751: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
42613 1727204577.64972: variable 'omit' from source: magic vars
42613 1727204577.64975: starting attempt loop
42613 1727204577.65383: running the handler
42613 1727204577.65400: _low_level_execute_command(): starting
42613 1727204577.65409: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
42613 1727204577.67498: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
42613 1727204577.67576: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<<
42613 1727204577.68088: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<<
42613 1727204577.69563: stdout chunk (state=3): >>>/root <<<
42613 1727204577.69873: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
42613 1727204577.70087: stderr chunk (state=3): >>><<<
42613 1727204577.70091: stdout chunk (state=3): >>><<<
42613 1727204577.70119: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
42613 1727204577.70132: _low_level_execute_command(): starting
42613 1727204577.70139: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204577.7011766-43065-261267420791627 `" && echo ansible-tmp-1727204577.7011766-43065-261267420791627="` echo /root/.ansible/tmp/ansible-tmp-1727204577.7011766-43065-261267420791627 `" ) && sleep 0'
42613 1727204577.71448: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<<
42613 1727204577.71508: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
42613 1727204577.71524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
42613 1727204577.71612: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
42613 1727204577.71639: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<<
42613 1727204577.71719: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
42613 1727204577.71825: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<<
42613 1727204577.71849: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
42613 1727204577.71959: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
42613 1727204577.74452: stdout chunk (state=3): >>>ansible-tmp-1727204577.7011766-43065-261267420791627=/root/.ansible/tmp/ansible-tmp-1727204577.7011766-43065-261267420791627 <<<
42613 1727204577.74459: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
42613 1727204577.74604: stderr chunk (state=3): >>><<<
42613 1727204577.74609: stdout chunk (state=3): >>><<<
42613 1727204577.74873: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204577.7011766-43065-261267420791627=/root/.ansible/tmp/ansible-tmp-1727204577.7011766-43065-261267420791627 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
42613 1727204577.74877: variable 'ansible_module_compression' from source: unknown
42613 1727204577.74976: ANSIBALLZ: Using generic lock for ansible.legacy.command
42613 1727204577.74979: ANSIBALLZ: Acquiring lock
42613 1727204577.74981: ANSIBALLZ: Lock acquired: 139982757271872
42613 1727204577.74983: ANSIBALLZ: Creating module
42613 1727204578.19654: ANSIBALLZ: Writing module into payload
42613 1727204578.19773: ANSIBALLZ: Writing module
42613 1727204578.19804: ANSIBALLZ: Renaming module
42613 1727204578.19815: ANSIBALLZ: Done creating module
42613 1727204578.19846: variable 'ansible_facts' from source: unknown
42613 1727204578.19929: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204577.7011766-43065-261267420791627/AnsiballZ_command.py
42613 1727204578.20090: Sending initial data
42613 1727204578.20169: Sent initial data (156 bytes)
42613 1727204578.21312: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
42613 1727204578.21395: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<<
42613 1727204578.21492: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
42613 1727204578.21593: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
42613 1727204578.21697: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
42613 1727204578.23503: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<<
42613 1727204578.23587: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<<
42613 1727204578.23656: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-42613igbktgy6/tmp6e3xw_om /root/.ansible/tmp/ansible-tmp-1727204577.7011766-43065-261267420791627/AnsiballZ_command.py <<<
42613 1727204578.23669: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204577.7011766-43065-261267420791627/AnsiballZ_command.py" <<<
42613 1727204578.23730: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-42613igbktgy6/tmp6e3xw_om" to remote "/root/.ansible/tmp/ansible-tmp-1727204577.7011766-43065-261267420791627/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204577.7011766-43065-261267420791627/AnsiballZ_command.py" <<<
42613 1727204578.25992: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
42613 1727204578.26027: stderr chunk (state=3): >>><<<
42613 1727204578.26040: stdout chunk (state=3): >>><<<
42613 1727204578.26106: done transferring module to remote
42613 1727204578.26124: _low_level_execute_command(): starting
42613 1727204578.26357: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204577.7011766-43065-261267420791627/ /root/.ansible/tmp/ansible-tmp-1727204577.7011766-43065-261267420791627/AnsiballZ_command.py && sleep 0'
42613 1727204578.27813: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
42613 1727204578.27891: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
42613 1727204578.27904: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<<
42613 1727204578.27992: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
42613 1727204578.28005: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
42613 1727204578.28100: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
42613 1727204578.30216: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
42613 1727204578.30538: stderr chunk (state=3): >>><<<
42613 1727204578.30558: stdout chunk (state=3): >>><<<
42613 1727204578.30904: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
42613 1727204578.30908: _low_level_execute_command(): starting
42613 1727204578.30911: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204577.7011766-43065-261267420791627/AnsiballZ_command.py && sleep 0'
42613 1727204578.32392: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
42613 1727204578.32521: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<<
42613 1727204578.32600: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
42613 1727204578.32718: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
42613 1727204578.32883: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
42613 1727204578.51295: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:02:58.503985", "end": "2024-09-24 15:02:58.508033", "delta": "0:00:00.004048", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<<
42613 1727204578.52767: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
42613 1727204578.52824: stderr chunk (state=3): >>>Shared connection to 10.31.45.169 closed. <<<
42613 1727204578.53017: stderr chunk (state=3): >>><<<
42613 1727204578.53020: stdout chunk (state=3): >>><<<
42613 1727204578.53275: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:02:58.503985", "end": "2024-09-24 15:02:58.508033", "delta": "0:00:00.004048", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1:
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 42613 1727204578.53279: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204577.7011766-43065-261267420791627/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 42613 1727204578.53282: _low_level_execute_command(): starting 42613 1727204578.53284: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204577.7011766-43065-261267420791627/ > /dev/null 2>&1 && sleep 0' 42613 1727204578.54921: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204578.55049: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204578.55208: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204578.55228: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204578.55355: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204578.57521: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204578.57525: stdout chunk (state=3): >>><<< 42613 1727204578.57531: stderr chunk (state=3): >>><<< 42613 1727204578.57590: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204578.57672: handler run complete 42613 1727204578.57675: Evaluated conditional (False): False 42613 1727204578.57678: attempt loop complete, returning result 42613 1727204578.57680: _execute() done 42613 1727204578.57682: dumping result to json 42613 1727204578.57684: done dumping result, returning 42613 1727204578.57686: done running TaskExecutor() for managed-node3/TASK: Gather current interface info [127b8e07-fff9-2f91-05d8-0000000001b0] 42613 1727204578.57688: sending task result for task 127b8e07-fff9-2f91-05d8-0000000001b0 42613 1727204578.57798: done sending task result for task 127b8e07-fff9-2f91-05d8-0000000001b0 42613 1727204578.57802: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.004048", "end": "2024-09-24 15:02:58.508033", "rc": 0, "start": "2024-09-24 15:02:58.503985" } STDOUT: bonding_masters eth0 lo 42613 1727204578.57899: no more pending results, returning what we have 42613 1727204578.57903: results queue empty 42613 1727204578.57906: checking for any_errors_fatal 42613 1727204578.57908: done checking for any_errors_fatal 42613 1727204578.57908: checking for max_fail_percentage 42613 1727204578.57911: done checking for max_fail_percentage 42613 1727204578.57912: checking to see if all hosts have failed and the running result is not ok 42613 1727204578.57913: done checking to see if all hosts have 
failed 42613 1727204578.57914: getting the remaining hosts for this loop 42613 1727204578.57915: done getting the remaining hosts for this loop 42613 1727204578.57920: getting the next task for host managed-node3 42613 1727204578.57930: done getting next task for host managed-node3 42613 1727204578.57933: ^ task is: TASK: Set current_interfaces 42613 1727204578.57937: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204578.57940: getting variables 42613 1727204578.57943: in VariableManager get_vars() 42613 1727204578.58597: Calling all_inventory to load vars for managed-node3 42613 1727204578.58601: Calling groups_inventory to load vars for managed-node3 42613 1727204578.58603: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204578.58622: Calling all_plugins_play to load vars for managed-node3 42613 1727204578.58626: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204578.58629: Calling groups_plugins_play to load vars for managed-node3 42613 1727204578.59363: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204578.61353: done with get_vars() 42613 1727204578.61385: done getting variables 42613 1727204578.61455: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 15:02:58 -0400 (0:00:01.029) 0:00:07.224 ***** 42613 1727204578.61652: entering _queue_task() for managed-node3/set_fact 42613 1727204578.62581: worker is 1 (out of 1 available) 42613 1727204578.62600: exiting _queue_task() for managed-node3/set_fact 42613 1727204578.62613: done queuing things up, now waiting for results queue to drain 42613 1727204578.62615: waiting for pending results... 
42613 1727204578.63289: running TaskExecutor() for managed-node3/TASK: Set current_interfaces 42613 1727204578.63361: in run() - task 127b8e07-fff9-2f91-05d8-0000000001b1 42613 1727204578.63393: variable 'ansible_search_path' from source: unknown 42613 1727204578.63463: variable 'ansible_search_path' from source: unknown 42613 1727204578.63684: calling self._execute() 42613 1727204578.63916: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204578.63930: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204578.63947: variable 'omit' from source: magic vars 42613 1727204578.64773: variable 'ansible_distribution_major_version' from source: facts 42613 1727204578.64795: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204578.64971: variable 'omit' from source: magic vars 42613 1727204578.64975: variable 'omit' from source: magic vars 42613 1727204578.65178: variable '_current_interfaces' from source: set_fact 42613 1727204578.65253: variable 'omit' from source: magic vars 42613 1727204578.65308: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 42613 1727204578.65572: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 42613 1727204578.65575: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 42613 1727204578.65578: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204578.65581: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204578.65627: variable 'inventory_hostname' from source: host vars for 'managed-node3' 42613 1727204578.65974: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204578.65979: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204578.65982: Set connection var ansible_shell_executable to /bin/sh 42613 1727204578.65985: Set connection var ansible_pipelining to False 42613 1727204578.65988: Set connection var ansible_module_compression to ZIP_DEFLATED 42613 1727204578.65991: Set connection var ansible_connection to ssh 42613 1727204578.65994: Set connection var ansible_timeout to 10 42613 1727204578.65996: Set connection var ansible_shell_type to sh 42613 1727204578.65999: variable 'ansible_shell_executable' from source: unknown 42613 1727204578.66001: variable 'ansible_connection' from source: unknown 42613 1727204578.66004: variable 'ansible_module_compression' from source: unknown 42613 1727204578.66006: variable 'ansible_shell_type' from source: unknown 42613 1727204578.66008: variable 'ansible_shell_executable' from source: unknown 42613 1727204578.66010: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204578.66013: variable 'ansible_pipelining' from source: unknown 42613 1727204578.66016: variable 'ansible_timeout' from source: unknown 42613 1727204578.66019: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204578.66392: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 42613 1727204578.66411: variable 'omit' from source: magic vars 42613 1727204578.66756: starting attempt loop 42613 1727204578.66762: running the handler 42613 1727204578.66764: handler run complete 42613 1727204578.66771: attempt loop complete, returning result 42613 1727204578.66774: _execute() done 42613 1727204578.66778: dumping result to json 42613 1727204578.66781: done dumping result, returning 42613 
1727204578.66785: done running TaskExecutor() for managed-node3/TASK: Set current_interfaces [127b8e07-fff9-2f91-05d8-0000000001b1] 42613 1727204578.66789: sending task result for task 127b8e07-fff9-2f91-05d8-0000000001b1 ok: [managed-node3] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 42613 1727204578.66931: no more pending results, returning what we have 42613 1727204578.66936: results queue empty 42613 1727204578.66937: checking for any_errors_fatal 42613 1727204578.66945: done checking for any_errors_fatal 42613 1727204578.66946: checking for max_fail_percentage 42613 1727204578.66948: done checking for max_fail_percentage 42613 1727204578.66949: checking to see if all hosts have failed and the running result is not ok 42613 1727204578.66949: done checking to see if all hosts have failed 42613 1727204578.66950: getting the remaining hosts for this loop 42613 1727204578.66951: done getting the remaining hosts for this loop 42613 1727204578.66956: getting the next task for host managed-node3 42613 1727204578.66964: done getting next task for host managed-node3 42613 1727204578.66969: ^ task is: TASK: Show current_interfaces 42613 1727204578.66972: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204578.66976: getting variables 42613 1727204578.66978: in VariableManager get_vars() 42613 1727204578.67017: Calling all_inventory to load vars for managed-node3 42613 1727204578.67020: Calling groups_inventory to load vars for managed-node3 42613 1727204578.67022: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204578.67038: Calling all_plugins_play to load vars for managed-node3 42613 1727204578.67040: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204578.67044: Calling groups_plugins_play to load vars for managed-node3 42613 1727204578.67771: done sending task result for task 127b8e07-fff9-2f91-05d8-0000000001b1 42613 1727204578.67776: WORKER PROCESS EXITING 42613 1727204578.67999: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204578.68616: done with get_vars() 42613 1727204578.68632: done getting variables 42613 1727204578.68854: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 15:02:58 -0400 (0:00:00.073) 0:00:07.298 ***** 42613 1727204578.69012: entering _queue_task() for managed-node3/debug 42613 1727204578.69015: Creating lock for debug 42613 1727204578.69769: worker is 1 (out of 1 available) 42613 1727204578.69784: exiting _queue_task() for managed-node3/debug 42613 1727204578.69796: done queuing things up, now waiting for results queue to drain 42613 1727204578.69798: waiting for pending results... 
42613 1727204578.70382: running TaskExecutor() for managed-node3/TASK: Show current_interfaces 42613 1727204578.70693: in run() - task 127b8e07-fff9-2f91-05d8-000000000122 42613 1727204578.70718: variable 'ansible_search_path' from source: unknown 42613 1727204578.70733: variable 'ansible_search_path' from source: unknown 42613 1727204578.70782: calling self._execute() 42613 1727204578.71077: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204578.71091: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204578.71106: variable 'omit' from source: magic vars 42613 1727204578.71979: variable 'ansible_distribution_major_version' from source: facts 42613 1727204578.72003: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204578.72019: variable 'omit' from source: magic vars 42613 1727204578.72492: variable 'omit' from source: magic vars 42613 1727204578.72669: variable 'current_interfaces' from source: set_fact 42613 1727204578.72871: variable 'omit' from source: magic vars 42613 1727204578.72874: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 42613 1727204578.72944: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 42613 1727204578.73056: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 42613 1727204578.73082: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204578.73100: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204578.73172: variable 'inventory_hostname' from source: host vars for 'managed-node3' 42613 1727204578.73355: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204578.73359: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204578.73503: Set connection var ansible_shell_executable to /bin/sh 42613 1727204578.73514: Set connection var ansible_pipelining to False 42613 1727204578.73586: Set connection var ansible_module_compression to ZIP_DEFLATED 42613 1727204578.73597: Set connection var ansible_connection to ssh 42613 1727204578.73608: Set connection var ansible_timeout to 10 42613 1727204578.73617: Set connection var ansible_shell_type to sh 42613 1727204578.73650: variable 'ansible_shell_executable' from source: unknown 42613 1727204578.73974: variable 'ansible_connection' from source: unknown 42613 1727204578.73978: variable 'ansible_module_compression' from source: unknown 42613 1727204578.73981: variable 'ansible_shell_type' from source: unknown 42613 1727204578.73983: variable 'ansible_shell_executable' from source: unknown 42613 1727204578.73985: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204578.74008: variable 'ansible_pipelining' from source: unknown 42613 1727204578.74011: variable 'ansible_timeout' from source: unknown 42613 1727204578.74013: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204578.74144: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 42613 1727204578.74370: variable 'omit' from source: magic vars 42613 1727204578.74374: starting attempt loop 42613 1727204578.74376: running the handler 42613 1727204578.74397: handler run complete 42613 1727204578.74425: attempt loop complete, returning result 42613 1727204578.74432: _execute() done 42613 1727204578.74439: dumping result to json 42613 1727204578.74446: done dumping result, returning 42613 1727204578.74531: done 
running TaskExecutor() for managed-node3/TASK: Show current_interfaces [127b8e07-fff9-2f91-05d8-000000000122] 42613 1727204578.74543: sending task result for task 127b8e07-fff9-2f91-05d8-000000000122 ok: [managed-node3] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 42613 1727204578.74725: no more pending results, returning what we have 42613 1727204578.74728: results queue empty 42613 1727204578.74729: checking for any_errors_fatal 42613 1727204578.74738: done checking for any_errors_fatal 42613 1727204578.74739: checking for max_fail_percentage 42613 1727204578.74742: done checking for max_fail_percentage 42613 1727204578.74743: checking to see if all hosts have failed and the running result is not ok 42613 1727204578.74744: done checking to see if all hosts have failed 42613 1727204578.74745: getting the remaining hosts for this loop 42613 1727204578.74747: done getting the remaining hosts for this loop 42613 1727204578.74753: getting the next task for host managed-node3 42613 1727204578.74762: done getting next task for host managed-node3 42613 1727204578.74880: ^ task is: TASK: Include the task 'manage_test_interface.yml' 42613 1727204578.74883: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204578.74887: getting variables 42613 1727204578.74889: in VariableManager get_vars() 42613 1727204578.74976: Calling all_inventory to load vars for managed-node3 42613 1727204578.74979: Calling groups_inventory to load vars for managed-node3 42613 1727204578.74982: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204578.74996: Calling all_plugins_play to load vars for managed-node3 42613 1727204578.74999: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204578.75003: Calling groups_plugins_play to load vars for managed-node3 42613 1727204578.75272: done sending task result for task 127b8e07-fff9-2f91-05d8-000000000122 42613 1727204578.75276: WORKER PROCESS EXITING 42613 1727204578.75313: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204578.75546: done with get_vars() 42613 1727204578.75558: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:16 Tuesday 24 September 2024 15:02:58 -0400 (0:00:00.066) 0:00:07.364 ***** 42613 1727204578.75669: entering _queue_task() for managed-node3/include_tasks 42613 1727204578.76036: worker is 1 (out of 1 available) 42613 1727204578.76051: exiting _queue_task() for managed-node3/include_tasks 42613 1727204578.76068: done queuing things up, now waiting for results queue to drain 42613 1727204578.76070: waiting for pending results... 
42613 1727204578.76319: running TaskExecutor() for managed-node3/TASK: Include the task 'manage_test_interface.yml' 42613 1727204578.76434: in run() - task 127b8e07-fff9-2f91-05d8-00000000000d 42613 1727204578.76456: variable 'ansible_search_path' from source: unknown 42613 1727204578.76505: calling self._execute() 42613 1727204578.76613: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204578.76625: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204578.76697: variable 'omit' from source: magic vars 42613 1727204578.77073: variable 'ansible_distribution_major_version' from source: facts 42613 1727204578.77094: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204578.77105: _execute() done 42613 1727204578.77114: dumping result to json 42613 1727204578.77122: done dumping result, returning 42613 1727204578.77141: done running TaskExecutor() for managed-node3/TASK: Include the task 'manage_test_interface.yml' [127b8e07-fff9-2f91-05d8-00000000000d] 42613 1727204578.77152: sending task result for task 127b8e07-fff9-2f91-05d8-00000000000d 42613 1727204578.77393: done sending task result for task 127b8e07-fff9-2f91-05d8-00000000000d 42613 1727204578.77397: WORKER PROCESS EXITING 42613 1727204578.77437: no more pending results, returning what we have 42613 1727204578.77443: in VariableManager get_vars() 42613 1727204578.77515: Calling all_inventory to load vars for managed-node3 42613 1727204578.77524: Calling groups_inventory to load vars for managed-node3 42613 1727204578.77527: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204578.77548: Calling all_plugins_play to load vars for managed-node3 42613 1727204578.77552: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204578.77556: Calling groups_plugins_play to load vars for managed-node3 42613 1727204578.78190: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204578.78651: done with get_vars() 42613 1727204578.78670: variable 'ansible_search_path' from source: unknown 42613 1727204578.78688: we have included files to process 42613 1727204578.78689: generating all_blocks data 42613 1727204578.78691: done generating all_blocks data 42613 1727204578.78695: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 42613 1727204578.78697: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 42613 1727204578.78699: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 42613 1727204578.79337: in VariableManager get_vars() 42613 1727204578.79359: done with get_vars() 42613 1727204578.79594: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 42613 1727204578.80214: done processing included file 42613 1727204578.80217: iterating over new_blocks loaded from include file 42613 1727204578.80219: in VariableManager get_vars() 42613 1727204578.80238: done with get_vars() 42613 1727204578.80240: filtering new block on tags 42613 1727204578.80278: done filtering new block on tags 42613 1727204578.80282: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed-node3 42613 1727204578.80293: extending task lists for all hosts with included blocks 42613 1727204578.81814: done extending task lists 42613 1727204578.81817: done processing included files 42613 1727204578.81817: results queue empty 42613 1727204578.81818: checking for any_errors_fatal 42613 1727204578.81822: done checking for 
any_errors_fatal 42613 1727204578.81823: checking for max_fail_percentage 42613 1727204578.81824: done checking for max_fail_percentage 42613 1727204578.81825: checking to see if all hosts have failed and the running result is not ok 42613 1727204578.81826: done checking to see if all hosts have failed 42613 1727204578.81826: getting the remaining hosts for this loop 42613 1727204578.81828: done getting the remaining hosts for this loop 42613 1727204578.81831: getting the next task for host managed-node3 42613 1727204578.81835: done getting next task for host managed-node3 42613 1727204578.81837: ^ task is: TASK: Ensure state in ["present", "absent"] 42613 1727204578.81840: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204578.81842: getting variables 42613 1727204578.81843: in VariableManager get_vars() 42613 1727204578.81861: Calling all_inventory to load vars for managed-node3 42613 1727204578.81863: Calling groups_inventory to load vars for managed-node3 42613 1727204578.81867: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204578.81874: Calling all_plugins_play to load vars for managed-node3 42613 1727204578.81877: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204578.81879: Calling groups_plugins_play to load vars for managed-node3 42613 1727204578.82091: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204578.82340: done with get_vars() 42613 1727204578.82360: done getting variables 42613 1727204578.82450: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Tuesday 24 September 2024 15:02:58 -0400 (0:00:00.068) 0:00:07.433 ***** 42613 1727204578.82493: entering _queue_task() for managed-node3/fail 42613 1727204578.82495: Creating lock for fail 42613 1727204578.82920: worker is 1 (out of 1 available) 42613 1727204578.82935: exiting _queue_task() for managed-node3/fail 42613 1727204578.82949: done queuing things up, now waiting for results queue to drain 42613 1727204578.82951: waiting for pending results... 
42613 1727204578.83391: running TaskExecutor() for managed-node3/TASK: Ensure state in ["present", "absent"] 42613 1727204578.83397: in run() - task 127b8e07-fff9-2f91-05d8-0000000001cc 42613 1727204578.83400: variable 'ansible_search_path' from source: unknown 42613 1727204578.83404: variable 'ansible_search_path' from source: unknown 42613 1727204578.83412: calling self._execute() 42613 1727204578.83540: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204578.83551: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204578.83563: variable 'omit' from source: magic vars 42613 1727204578.84343: variable 'ansible_distribution_major_version' from source: facts 42613 1727204578.84356: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204578.85035: variable 'state' from source: include params 42613 1727204578.85043: Evaluated conditional (state not in ["present", "absent"]): False 42613 1727204578.85046: when evaluation is False, skipping this task 42613 1727204578.85049: _execute() done 42613 1727204578.85052: dumping result to json 42613 1727204578.85054: done dumping result, returning 42613 1727204578.85119: done running TaskExecutor() for managed-node3/TASK: Ensure state in ["present", "absent"] [127b8e07-fff9-2f91-05d8-0000000001cc] 42613 1727204578.85127: sending task result for task 127b8e07-fff9-2f91-05d8-0000000001cc 42613 1727204578.85227: done sending task result for task 127b8e07-fff9-2f91-05d8-0000000001cc 42613 1727204578.85344: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 42613 1727204578.85405: no more pending results, returning what we have 42613 1727204578.85410: results queue empty 42613 1727204578.85411: checking for any_errors_fatal 42613 1727204578.85412: done checking for any_errors_fatal 42613 1727204578.85413: 
checking for max_fail_percentage 42613 1727204578.85415: done checking for max_fail_percentage 42613 1727204578.85416: checking to see if all hosts have failed and the running result is not ok 42613 1727204578.85417: done checking to see if all hosts have failed 42613 1727204578.85417: getting the remaining hosts for this loop 42613 1727204578.85419: done getting the remaining hosts for this loop 42613 1727204578.85423: getting the next task for host managed-node3 42613 1727204578.85430: done getting next task for host managed-node3 42613 1727204578.85432: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 42613 1727204578.85436: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204578.85440: getting variables 42613 1727204578.85441: in VariableManager get_vars() 42613 1727204578.85490: Calling all_inventory to load vars for managed-node3 42613 1727204578.85494: Calling groups_inventory to load vars for managed-node3 42613 1727204578.85497: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204578.85514: Calling all_plugins_play to load vars for managed-node3 42613 1727204578.85517: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204578.85520: Calling groups_plugins_play to load vars for managed-node3 42613 1727204578.85930: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204578.86622: done with get_vars() 42613 1727204578.86639: done getting variables 42613 1727204578.86709: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Tuesday 24 September 2024 15:02:58 -0400 (0:00:00.042) 0:00:07.475 ***** 42613 1727204578.86744: entering _queue_task() for managed-node3/fail 42613 1727204578.87519: worker is 1 (out of 1 available) 42613 1727204578.87532: exiting _queue_task() for managed-node3/fail 42613 1727204578.87548: done queuing things up, now waiting for results queue to drain 42613 1727204578.87549: waiting for pending results... 
42613 1727204578.88321: running TaskExecutor() for managed-node3/TASK: Ensure type in ["dummy", "tap", "veth"] 42613 1727204578.88636: in run() - task 127b8e07-fff9-2f91-05d8-0000000001cd 42613 1727204578.88968: variable 'ansible_search_path' from source: unknown 42613 1727204578.88973: variable 'ansible_search_path' from source: unknown 42613 1727204578.88976: calling self._execute() 42613 1727204578.89051: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204578.89314: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204578.89318: variable 'omit' from source: magic vars 42613 1727204578.90751: variable 'ansible_distribution_major_version' from source: facts 42613 1727204578.90778: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204578.91340: variable 'type' from source: set_fact 42613 1727204578.91354: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 42613 1727204578.91362: when evaluation is False, skipping this task 42613 1727204578.91389: _execute() done 42613 1727204578.91419: dumping result to json 42613 1727204578.91460: done dumping result, returning 42613 1727204578.91503: done running TaskExecutor() for managed-node3/TASK: Ensure type in ["dummy", "tap", "veth"] [127b8e07-fff9-2f91-05d8-0000000001cd] 42613 1727204578.91522: sending task result for task 127b8e07-fff9-2f91-05d8-0000000001cd 42613 1727204578.91979: done sending task result for task 127b8e07-fff9-2f91-05d8-0000000001cd 42613 1727204578.91984: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 42613 1727204578.92155: no more pending results, returning what we have 42613 1727204578.92158: results queue empty 42613 1727204578.92159: checking for any_errors_fatal 42613 1727204578.92299: done checking for any_errors_fatal 42613 1727204578.92301: 
checking for max_fail_percentage 42613 1727204578.92304: done checking for max_fail_percentage 42613 1727204578.92305: checking to see if all hosts have failed and the running result is not ok 42613 1727204578.92306: done checking to see if all hosts have failed 42613 1727204578.92306: getting the remaining hosts for this loop 42613 1727204578.92308: done getting the remaining hosts for this loop 42613 1727204578.92313: getting the next task for host managed-node3 42613 1727204578.92321: done getting next task for host managed-node3 42613 1727204578.92324: ^ task is: TASK: Include the task 'show_interfaces.yml' 42613 1727204578.92328: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204578.92333: getting variables 42613 1727204578.92334: in VariableManager get_vars() 42613 1727204578.92459: Calling all_inventory to load vars for managed-node3 42613 1727204578.92463: Calling groups_inventory to load vars for managed-node3 42613 1727204578.92571: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204578.92587: Calling all_plugins_play to load vars for managed-node3 42613 1727204578.92590: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204578.92594: Calling groups_plugins_play to load vars for managed-node3 42613 1727204578.93138: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204578.93611: done with get_vars() 42613 1727204578.93627: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Tuesday 24 September 2024 15:02:58 -0400 (0:00:00.069) 0:00:07.545 ***** 42613 1727204578.93731: entering _queue_task() for managed-node3/include_tasks 42613 1727204578.94987: worker is 1 (out of 1 available) 42613 1727204578.94999: exiting _queue_task() for managed-node3/include_tasks 42613 1727204578.95010: done queuing things up, now waiting for results queue to drain 42613 1727204578.95011: waiting for pending results... 
42613 1727204578.95548: running TaskExecutor() for managed-node3/TASK: Include the task 'show_interfaces.yml' 42613 1727204578.95643: in run() - task 127b8e07-fff9-2f91-05d8-0000000001ce 42613 1727204578.95648: variable 'ansible_search_path' from source: unknown 42613 1727204578.95652: variable 'ansible_search_path' from source: unknown 42613 1727204578.95700: calling self._execute() 42613 1727204578.95995: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204578.96092: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204578.96098: variable 'omit' from source: magic vars 42613 1727204578.97126: variable 'ansible_distribution_major_version' from source: facts 42613 1727204578.97131: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204578.97136: _execute() done 42613 1727204578.97139: dumping result to json 42613 1727204578.97142: done dumping result, returning 42613 1727204578.97144: done running TaskExecutor() for managed-node3/TASK: Include the task 'show_interfaces.yml' [127b8e07-fff9-2f91-05d8-0000000001ce] 42613 1727204578.97146: sending task result for task 127b8e07-fff9-2f91-05d8-0000000001ce 42613 1727204578.97491: no more pending results, returning what we have 42613 1727204578.97498: in VariableManager get_vars() 42613 1727204578.97547: Calling all_inventory to load vars for managed-node3 42613 1727204578.97551: Calling groups_inventory to load vars for managed-node3 42613 1727204578.97553: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204578.97562: done sending task result for task 127b8e07-fff9-2f91-05d8-0000000001ce 42613 1727204578.97569: WORKER PROCESS EXITING 42613 1727204578.97586: Calling all_plugins_play to load vars for managed-node3 42613 1727204578.97590: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204578.97593: Calling groups_plugins_play to load vars for managed-node3 42613 1727204578.98148: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204578.98896: done with get_vars() 42613 1727204578.98907: variable 'ansible_search_path' from source: unknown 42613 1727204578.98908: variable 'ansible_search_path' from source: unknown 42613 1727204578.98951: we have included files to process 42613 1727204578.98953: generating all_blocks data 42613 1727204578.98954: done generating all_blocks data 42613 1727204578.98960: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 42613 1727204578.98962: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 42613 1727204578.98964: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 42613 1727204578.99391: in VariableManager get_vars() 42613 1727204578.99419: done with get_vars() 42613 1727204578.99890: done processing included file 42613 1727204578.99893: iterating over new_blocks loaded from include file 42613 1727204578.99895: in VariableManager get_vars() 42613 1727204578.99917: done with get_vars() 42613 1727204578.99919: filtering new block on tags 42613 1727204578.99940: done filtering new block on tags 42613 1727204578.99943: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node3 42613 1727204578.99949: extending task lists for all hosts with included blocks 42613 1727204579.00931: done extending task lists 42613 1727204579.00934: done processing included files 42613 1727204579.00934: results queue empty 42613 1727204579.00935: checking for any_errors_fatal 42613 1727204579.00938: done checking for any_errors_fatal 42613 1727204579.00939: checking for 
max_fail_percentage 42613 1727204579.00941: done checking for max_fail_percentage 42613 1727204579.00942: checking to see if all hosts have failed and the running result is not ok 42613 1727204579.00942: done checking to see if all hosts have failed 42613 1727204579.00943: getting the remaining hosts for this loop 42613 1727204579.00945: done getting the remaining hosts for this loop 42613 1727204579.00948: getting the next task for host managed-node3 42613 1727204579.00952: done getting next task for host managed-node3 42613 1727204579.00955: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 42613 1727204579.00958: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204579.00961: getting variables 42613 1727204579.00962: in VariableManager get_vars() 42613 1727204579.01375: Calling all_inventory to load vars for managed-node3 42613 1727204579.01379: Calling groups_inventory to load vars for managed-node3 42613 1727204579.01381: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204579.01388: Calling all_plugins_play to load vars for managed-node3 42613 1727204579.01391: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204579.01395: Calling groups_plugins_play to load vars for managed-node3 42613 1727204579.01595: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204579.02105: done with get_vars() 42613 1727204579.02118: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Tuesday 24 September 2024 15:02:59 -0400 (0:00:00.086) 0:00:07.632 ***** 42613 1727204579.02399: entering _queue_task() for managed-node3/include_tasks 42613 1727204579.03185: worker is 1 (out of 1 available) 42613 1727204579.03196: exiting _queue_task() for managed-node3/include_tasks 42613 1727204579.03207: done queuing things up, now waiting for results queue to drain 42613 1727204579.03209: waiting for pending results... 
42613 1727204579.03431: running TaskExecutor() for managed-node3/TASK: Include the task 'get_current_interfaces.yml' 42613 1727204579.03635: in run() - task 127b8e07-fff9-2f91-05d8-000000000275 42613 1727204579.03641: variable 'ansible_search_path' from source: unknown 42613 1727204579.03644: variable 'ansible_search_path' from source: unknown 42613 1727204579.03649: calling self._execute() 42613 1727204579.03886: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204579.03890: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204579.03894: variable 'omit' from source: magic vars 42613 1727204579.04209: variable 'ansible_distribution_major_version' from source: facts 42613 1727204579.04213: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204579.04216: _execute() done 42613 1727204579.04218: dumping result to json 42613 1727204579.04221: done dumping result, returning 42613 1727204579.04224: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_current_interfaces.yml' [127b8e07-fff9-2f91-05d8-000000000275] 42613 1727204579.04226: sending task result for task 127b8e07-fff9-2f91-05d8-000000000275 42613 1727204579.04532: no more pending results, returning what we have 42613 1727204579.04539: in VariableManager get_vars() 42613 1727204579.04642: Calling all_inventory to load vars for managed-node3 42613 1727204579.04645: Calling groups_inventory to load vars for managed-node3 42613 1727204579.04647: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204579.04663: Calling all_plugins_play to load vars for managed-node3 42613 1727204579.04669: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204579.04673: Calling groups_plugins_play to load vars for managed-node3 42613 1727204579.05277: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 
1727204579.05699: done with get_vars() 42613 1727204579.05709: variable 'ansible_search_path' from source: unknown 42613 1727204579.05711: variable 'ansible_search_path' from source: unknown 42613 1727204579.05877: done sending task result for task 127b8e07-fff9-2f91-05d8-000000000275 42613 1727204579.05884: WORKER PROCESS EXITING 42613 1727204579.05936: we have included files to process 42613 1727204579.05938: generating all_blocks data 42613 1727204579.05939: done generating all_blocks data 42613 1727204579.05941: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 42613 1727204579.05942: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 42613 1727204579.05944: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 42613 1727204579.06681: done processing included file 42613 1727204579.06684: iterating over new_blocks loaded from include file 42613 1727204579.06686: in VariableManager get_vars() 42613 1727204579.06709: done with get_vars() 42613 1727204579.06711: filtering new block on tags 42613 1727204579.06732: done filtering new block on tags 42613 1727204579.06735: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed-node3 42613 1727204579.06741: extending task lists for all hosts with included blocks 42613 1727204579.07114: done extending task lists 42613 1727204579.07115: done processing included files 42613 1727204579.07116: results queue empty 42613 1727204579.07117: checking for any_errors_fatal 42613 1727204579.07120: done checking for any_errors_fatal 42613 1727204579.07121: checking for max_fail_percentage 42613 
1727204579.07123: done checking for max_fail_percentage 42613 1727204579.07123: checking to see if all hosts have failed and the running result is not ok 42613 1727204579.07124: done checking to see if all hosts have failed 42613 1727204579.07125: getting the remaining hosts for this loop 42613 1727204579.07126: done getting the remaining hosts for this loop 42613 1727204579.07129: getting the next task for host managed-node3 42613 1727204579.07134: done getting next task for host managed-node3 42613 1727204579.07137: ^ task is: TASK: Gather current interface info 42613 1727204579.07141: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204579.07144: getting variables 42613 1727204579.07145: in VariableManager get_vars() 42613 1727204579.07159: Calling all_inventory to load vars for managed-node3 42613 1727204579.07162: Calling groups_inventory to load vars for managed-node3 42613 1727204579.07164: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204579.07172: Calling all_plugins_play to load vars for managed-node3 42613 1727204579.07467: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204579.07475: Calling groups_plugins_play to load vars for managed-node3 42613 1727204579.07802: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204579.08362: done with get_vars() 42613 1727204579.08482: done getting variables 42613 1727204579.08533: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Tuesday 24 September 2024 15:02:59 -0400 (0:00:00.061) 0:00:07.693 ***** 42613 1727204579.08572: entering _queue_task() for managed-node3/command 42613 1727204579.09272: worker is 1 (out of 1 available) 42613 1727204579.09288: exiting _queue_task() for managed-node3/command 42613 1727204579.09301: done queuing things up, now waiting for results queue to drain 42613 1727204579.09303: waiting for pending results... 
42613 1727204579.10068: running TaskExecutor() for managed-node3/TASK: Gather current interface info 42613 1727204579.10607: in run() - task 127b8e07-fff9-2f91-05d8-0000000002ac 42613 1727204579.10635: variable 'ansible_search_path' from source: unknown 42613 1727204579.10972: variable 'ansible_search_path' from source: unknown 42613 1727204579.10988: calling self._execute() 42613 1727204579.11274: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204579.11278: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204579.11281: variable 'omit' from source: magic vars 42613 1727204579.12491: variable 'ansible_distribution_major_version' from source: facts 42613 1727204579.12503: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204579.12514: variable 'omit' from source: magic vars 42613 1727204579.12704: variable 'omit' from source: magic vars 42613 1727204579.12747: variable 'omit' from source: magic vars 42613 1727204579.12917: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 42613 1727204579.12961: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 42613 1727204579.13077: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 42613 1727204579.13110: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204579.13163: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204579.13169: variable 'inventory_hostname' from source: host vars for 'managed-node3' 42613 1727204579.13172: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204579.13174: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 
1727204579.13518: Set connection var ansible_shell_executable to /bin/sh 42613 1727204579.13642: Set connection var ansible_pipelining to False 42613 1727204579.13744: Set connection var ansible_module_compression to ZIP_DEFLATED 42613 1727204579.13748: Set connection var ansible_connection to ssh 42613 1727204579.13750: Set connection var ansible_timeout to 10 42613 1727204579.13752: Set connection var ansible_shell_type to sh 42613 1727204579.13754: variable 'ansible_shell_executable' from source: unknown 42613 1727204579.13756: variable 'ansible_connection' from source: unknown 42613 1727204579.13870: variable 'ansible_module_compression' from source: unknown 42613 1727204579.13874: variable 'ansible_shell_type' from source: unknown 42613 1727204579.13877: variable 'ansible_shell_executable' from source: unknown 42613 1727204579.13879: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204579.13882: variable 'ansible_pipelining' from source: unknown 42613 1727204579.13884: variable 'ansible_timeout' from source: unknown 42613 1727204579.13886: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204579.14157: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 42613 1727204579.14207: variable 'omit' from source: magic vars 42613 1727204579.14217: starting attempt loop 42613 1727204579.14294: running the handler 42613 1727204579.14314: _low_level_execute_command(): starting 42613 1727204579.14327: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 42613 1727204579.16486: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204579.16552: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204579.16614: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204579.16808: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204579.17005: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204579.18781: stdout chunk (state=3): >>>/root <<< 42613 1727204579.18889: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204579.19032: stderr chunk (state=3): >>><<< 42613 1727204579.19038: stdout chunk (state=3): >>><<< 42613 1727204579.19067: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204579.19170: _low_level_execute_command(): starting 42613 1727204579.19175: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204579.1912076-43261-93240875773274 `" && echo ansible-tmp-1727204579.1912076-43261-93240875773274="` echo /root/.ansible/tmp/ansible-tmp-1727204579.1912076-43261-93240875773274 `" ) && sleep 0' 42613 1727204579.20422: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204579.20479: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204579.20505: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 42613 1727204579.20583: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204579.20615: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204579.20644: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204579.20664: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204579.20773: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204579.23231: stdout chunk (state=3): >>>ansible-tmp-1727204579.1912076-43261-93240875773274=/root/.ansible/tmp/ansible-tmp-1727204579.1912076-43261-93240875773274 <<< 42613 1727204579.23236: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204579.23240: stdout chunk (state=3): >>><<< 42613 1727204579.23243: stderr chunk (state=3): >>><<< 42613 1727204579.23264: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204579.1912076-43261-93240875773274=/root/.ansible/tmp/ansible-tmp-1727204579.1912076-43261-93240875773274 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204579.23673: variable 'ansible_module_compression' from source: unknown 42613 1727204579.23676: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-42613igbktgy6/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 42613 1727204579.23678: variable 'ansible_facts' from source: unknown 42613 1727204579.23978: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204579.1912076-43261-93240875773274/AnsiballZ_command.py 42613 1727204579.24127: Sending initial data 42613 1727204579.24138: Sent initial data (155 bytes) 42613 1727204579.25390: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204579.25421: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204579.25445: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204579.25461: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204579.25570: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204579.27362: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 42613 1727204579.27450: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 42613 1727204579.27777: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-42613igbktgy6/tmpaoyq1v3e /root/.ansible/tmp/ansible-tmp-1727204579.1912076-43261-93240875773274/AnsiballZ_command.py <<< 42613 1727204579.27781: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204579.1912076-43261-93240875773274/AnsiballZ_command.py" <<< 42613 1727204579.27869: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-42613igbktgy6/tmpaoyq1v3e" to remote "/root/.ansible/tmp/ansible-tmp-1727204579.1912076-43261-93240875773274/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204579.1912076-43261-93240875773274/AnsiballZ_command.py" <<< 42613 1727204579.29296: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204579.29343: stderr chunk (state=3): >>><<< 42613 1727204579.29357: stdout chunk (state=3): >>><<< 42613 1727204579.29392: done transferring module to remote 42613 1727204579.29417: _low_level_execute_command(): starting 42613 1727204579.29427: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204579.1912076-43261-93240875773274/ /root/.ansible/tmp/ansible-tmp-1727204579.1912076-43261-93240875773274/AnsiballZ_command.py && sleep 0' 42613 1727204579.30090: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204579.30105: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204579.30122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204579.30143: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 42613 1727204579.30255: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204579.30284: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204579.30304: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204579.30410: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204579.32548: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204579.32673: stderr chunk (state=3): >>><<< 42613 1727204579.32677: stdout chunk (state=3): >>><<< 42613 1727204579.32695: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204579.32698: _low_level_execute_command(): starting 42613 1727204579.32705: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204579.1912076-43261-93240875773274/AnsiballZ_command.py && sleep 0' 42613 1727204579.34227: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204579.34328: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found <<< 42613 1727204579.34473: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204579.34529: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204579.34735: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' 
debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204579.34782: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204579.34909: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204579.53073: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:02:59.525500", "end": "2024-09-24 15:02:59.529433", "delta": "0:00:00.003933", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 42613 1727204579.55579: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. <<< 42613 1727204579.55583: stderr chunk (state=3): >>><<< 42613 1727204579.55586: stdout chunk (state=3): >>><<< 42613 1727204579.55588: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:02:59.525500", "end": "2024-09-24 15:02:59.529433", "delta": "0:00:00.003933", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 42613 1727204579.55591: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204579.1912076-43261-93240875773274/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 42613 1727204579.55593: _low_level_execute_command(): starting 42613 1727204579.55595: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204579.1912076-43261-93240875773274/ > /dev/null 2>&1 && sleep 0' 42613 1727204579.57978: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 
Jun 2024 <<< 42613 1727204579.58253: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204579.58398: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204579.58812: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204579.58911: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204579.61475: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204579.61479: stdout chunk (state=3): >>><<< 42613 1727204579.61482: stderr chunk (state=3): >>><<< 42613 1727204579.61484: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204579.61487: handler run complete 42613 1727204579.61489: Evaluated conditional (False): False 42613 1727204579.61491: attempt loop complete, returning result 42613 1727204579.61493: _execute() done 42613 1727204579.61495: dumping result to json 42613 1727204579.61497: done dumping result, returning 42613 1727204579.61499: done running TaskExecutor() for managed-node3/TASK: Gather current interface info [127b8e07-fff9-2f91-05d8-0000000002ac] 42613 1727204579.61501: sending task result for task 127b8e07-fff9-2f91-05d8-0000000002ac 42613 1727204579.61583: done sending task result for task 127b8e07-fff9-2f91-05d8-0000000002ac 42613 1727204579.61587: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003933", "end": "2024-09-24 15:02:59.529433", "rc": 0, "start": "2024-09-24 15:02:59.525500" } STDOUT: bonding_masters eth0 lo 42613 1727204579.61755: no more pending results, returning what we have 42613 1727204579.61759: results queue empty 42613 1727204579.61760: checking for any_errors_fatal 42613 1727204579.61761: done checking for any_errors_fatal 42613 1727204579.61762: checking for max_fail_percentage 
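The task result above shows the command module reporting `ls -1` run in `/sys/class/net`, with `stdout` `bonding_masters\neth0\nlo`. The next task in the log (`Set current_interfaces`) turns that stdout into a list fact. A minimal sketch of that derivation, using a trimmed copy of the result payload from the log (the helper name `interfaces_from_result` is illustrative, not an Ansible API):

```python
import json

# A trimmed copy of the module result JSON visible in the log above.
raw_result = json.loads(
    '{"changed": true, "stdout": "bonding_masters\\neth0\\nlo", '
    '"rc": 0, "cmd": ["ls", "-1"]}'
)

def interfaces_from_result(result: dict) -> list[str]:
    """Split the command module's stdout into one entry per line,
    mirroring what Ansible exposes to playbooks as stdout_lines."""
    return result["stdout"].splitlines()

print(interfaces_from_result(raw_result))  # ['bonding_masters', 'eth0', 'lo']
```

This matches the `current_interfaces` fact value that appears in the `Set current_interfaces` result later in the log.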
42613 1727204579.61764: done checking for max_fail_percentage 42613 1727204579.61768: checking to see if all hosts have failed and the running result is not ok 42613 1727204579.61769: done checking to see if all hosts have failed 42613 1727204579.61769: getting the remaining hosts for this loop 42613 1727204579.61771: done getting the remaining hosts for this loop 42613 1727204579.61776: getting the next task for host managed-node3 42613 1727204579.61785: done getting next task for host managed-node3 42613 1727204579.61789: ^ task is: TASK: Set current_interfaces 42613 1727204579.61794: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204579.61799: getting variables 42613 1727204579.61801: in VariableManager get_vars() 42613 1727204579.61848: Calling all_inventory to load vars for managed-node3 42613 1727204579.61851: Calling groups_inventory to load vars for managed-node3 42613 1727204579.61853: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204579.62251: Calling all_plugins_play to load vars for managed-node3 42613 1727204579.62257: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204579.62262: Calling groups_plugins_play to load vars for managed-node3 42613 1727204579.63138: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204579.64628: done with get_vars() 42613 1727204579.64647: done getting variables 42613 1727204579.65023: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 15:02:59 -0400 (0:00:00.564) 0:00:08.258 ***** 42613 1727204579.65068: entering _queue_task() for managed-node3/set_fact 42613 1727204579.66216: worker is 1 (out of 1 available) 42613 1727204579.66231: exiting _queue_task() for managed-node3/set_fact 42613 1727204579.66242: done queuing things up, now waiting for results queue to drain 42613 1727204579.66244: waiting for pending results... 
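The `entering _queue_task()` / `waiting for pending results...` lines above show the strategy loop's producer/consumer shape: the main process hands a task to a worker, then drains a results queue. A rough sketch of that shape, using a thread and `queue.Queue` instead of Ansible's actual forked `WorkerProcess` (an illustration of the pattern only, not Ansible's implementation):

```python
import queue
import threading

def worker(task: str, results: queue.Queue) -> None:
    # Stand-in for TaskExecutor().run(): execute the task, post the result.
    results.put({"task": task, "changed": False})

results: queue.Queue = queue.Queue()
t = threading.Thread(target=worker, args=("Set current_interfaces", results))
t.start()                 # _queue_task(): task handed to the worker
result = results.get()    # "waiting for pending results..."
t.join()                  # "WORKER PROCESS EXITING"
print(result["task"])
```

The blocking `results.get()` is why the log interleaves "done queuing things up, now waiting for results queue to drain" with the worker's own output.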
42613 1727204579.66768: running TaskExecutor() for managed-node3/TASK: Set current_interfaces 42613 1727204579.66957: in run() - task 127b8e07-fff9-2f91-05d8-0000000002ad 42613 1727204579.67023: variable 'ansible_search_path' from source: unknown 42613 1727204579.67108: variable 'ansible_search_path' from source: unknown 42613 1727204579.67274: calling self._execute() 42613 1727204579.67354: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204579.67392: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204579.67457: variable 'omit' from source: magic vars 42613 1727204579.68413: variable 'ansible_distribution_major_version' from source: facts 42613 1727204579.68650: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204579.68655: variable 'omit' from source: magic vars 42613 1727204579.68657: variable 'omit' from source: magic vars 42613 1727204579.69086: variable '_current_interfaces' from source: set_fact 42613 1727204579.69091: variable 'omit' from source: magic vars 42613 1727204579.69209: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 42613 1727204579.69256: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 42613 1727204579.69328: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 42613 1727204579.69433: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204579.69451: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204579.69492: variable 'inventory_hostname' from source: host vars for 'managed-node3' 42613 1727204579.69526: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204579.69535: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204579.69825: Set connection var ansible_shell_executable to /bin/sh 42613 1727204579.69837: Set connection var ansible_pipelining to False 42613 1727204579.69860: Set connection var ansible_module_compression to ZIP_DEFLATED 42613 1727204579.70067: Set connection var ansible_connection to ssh 42613 1727204579.70072: Set connection var ansible_timeout to 10 42613 1727204579.70074: Set connection var ansible_shell_type to sh 42613 1727204579.70077: variable 'ansible_shell_executable' from source: unknown 42613 1727204579.70080: variable 'ansible_connection' from source: unknown 42613 1727204579.70082: variable 'ansible_module_compression' from source: unknown 42613 1727204579.70084: variable 'ansible_shell_type' from source: unknown 42613 1727204579.70086: variable 'ansible_shell_executable' from source: unknown 42613 1727204579.70088: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204579.70090: variable 'ansible_pipelining' from source: unknown 42613 1727204579.70091: variable 'ansible_timeout' from source: unknown 42613 1727204579.70093: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204579.70414: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 42613 1727204579.70571: variable 'omit' from source: magic vars 42613 1727204579.70574: starting attempt loop 42613 1727204579.70577: running the handler 42613 1727204579.70580: handler run complete 42613 1727204579.70582: attempt loop complete, returning result 42613 1727204579.70584: _execute() done 42613 1727204579.70586: dumping result to json 42613 1727204579.70589: done dumping result, returning 42613 
1727204579.70591: done running TaskExecutor() for managed-node3/TASK: Set current_interfaces [127b8e07-fff9-2f91-05d8-0000000002ad] 42613 1727204579.70829: sending task result for task 127b8e07-fff9-2f91-05d8-0000000002ad 42613 1727204579.70910: done sending task result for task 127b8e07-fff9-2f91-05d8-0000000002ad 42613 1727204579.70913: WORKER PROCESS EXITING ok: [managed-node3] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 42613 1727204579.71003: no more pending results, returning what we have 42613 1727204579.71007: results queue empty 42613 1727204579.71008: checking for any_errors_fatal 42613 1727204579.71015: done checking for any_errors_fatal 42613 1727204579.71016: checking for max_fail_percentage 42613 1727204579.71019: done checking for max_fail_percentage 42613 1727204579.71020: checking to see if all hosts have failed and the running result is not ok 42613 1727204579.71021: done checking to see if all hosts have failed 42613 1727204579.71022: getting the remaining hosts for this loop 42613 1727204579.71024: done getting the remaining hosts for this loop 42613 1727204579.71029: getting the next task for host managed-node3 42613 1727204579.71042: done getting next task for host managed-node3 42613 1727204579.71045: ^ task is: TASK: Show current_interfaces 42613 1727204579.71050: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 42613 1727204579.71055: getting variables 42613 1727204579.71057: in VariableManager get_vars() 42613 1727204579.71099: Calling all_inventory to load vars for managed-node3 42613 1727204579.71102: Calling groups_inventory to load vars for managed-node3 42613 1727204579.71105: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204579.71117: Calling all_plugins_play to load vars for managed-node3 42613 1727204579.71120: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204579.71123: Calling groups_plugins_play to load vars for managed-node3 42613 1727204579.71799: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204579.72754: done with get_vars() 42613 1727204579.72977: done getting variables 42613 1727204579.73043: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 15:02:59 -0400 (0:00:00.080) 0:00:08.338 ***** 42613 1727204579.73081: entering _queue_task() for managed-node3/debug 42613 1727204579.73847: worker is 1 (out of 1 available) 42613 1727204579.73863: exiting _queue_task() for managed-node3/debug 42613 1727204579.73881: done queuing things up, now waiting for results queue to drain 42613 1727204579.73883: waiting for pending results... 
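The repeated `Set connection var ...` / `variable ... from source: unknown` lines in this log reflect layered variable resolution: host vars (e.g. `ansible_host` from the inventory) override play- and default-level values such as `ansible_shell_executable` and `ansible_timeout`. A simplified sketch of that precedence with `collections.ChainMap`; the values below are the ones visible in the log, and this is not a complete copy of Ansible's resolution logic:

```python
from collections import ChainMap

# Lowest-precedence layer: defaults seen in the "Set connection var" lines.
defaults = {
    "ansible_shell_type": "sh",
    "ansible_shell_executable": "/bin/sh",
    "ansible_timeout": 10,
    "ansible_pipelining": False,
}
# Higher-precedence layer: host vars, per the inventory setup in this run.
host_vars = {"ansible_host": "10.31.45.169"}

# First mapping wins, so host vars shadow defaults where both define a key.
conn_vars = ChainMap(host_vars, defaults)
print(conn_vars["ansible_shell_executable"])  # /bin/sh (from defaults)
print(conn_vars["ansible_host"])              # 10.31.45.169 (from host vars)
```

This is why the log reports some connection vars as coming "from source: host vars for 'managed-node3'" and others "from source: unknown" (i.e. falling through to defaults).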
42613 1727204579.74527: running TaskExecutor() for managed-node3/TASK: Show current_interfaces 42613 1727204579.74731: in run() - task 127b8e07-fff9-2f91-05d8-000000000276 42613 1727204579.74782: variable 'ansible_search_path' from source: unknown 42613 1727204579.74989: variable 'ansible_search_path' from source: unknown 42613 1727204579.74994: calling self._execute() 42613 1727204579.75118: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204579.75188: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204579.75273: variable 'omit' from source: magic vars 42613 1727204579.76087: variable 'ansible_distribution_major_version' from source: facts 42613 1727204579.76099: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204579.76108: variable 'omit' from source: magic vars 42613 1727204579.76293: variable 'omit' from source: magic vars 42613 1727204579.76673: variable 'current_interfaces' from source: set_fact 42613 1727204579.76679: variable 'omit' from source: magic vars 42613 1727204579.76682: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 42613 1727204579.76686: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 42613 1727204579.76688: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 42613 1727204579.76785: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204579.76789: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204579.76792: variable 'inventory_hostname' from source: host vars for 'managed-node3' 42613 1727204579.76794: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204579.76796: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204579.77037: Set connection var ansible_shell_executable to /bin/sh 42613 1727204579.77041: Set connection var ansible_pipelining to False 42613 1727204579.77101: Set connection var ansible_module_compression to ZIP_DEFLATED 42613 1727204579.77104: Set connection var ansible_connection to ssh 42613 1727204579.77108: Set connection var ansible_timeout to 10 42613 1727204579.77110: Set connection var ansible_shell_type to sh 42613 1727204579.77112: variable 'ansible_shell_executable' from source: unknown 42613 1727204579.77115: variable 'ansible_connection' from source: unknown 42613 1727204579.77117: variable 'ansible_module_compression' from source: unknown 42613 1727204579.77119: variable 'ansible_shell_type' from source: unknown 42613 1727204579.77121: variable 'ansible_shell_executable' from source: unknown 42613 1727204579.77123: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204579.77126: variable 'ansible_pipelining' from source: unknown 42613 1727204579.77128: variable 'ansible_timeout' from source: unknown 42613 1727204579.77130: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204579.77479: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 42613 1727204579.77494: variable 'omit' from source: magic vars 42613 1727204579.77499: starting attempt loop 42613 1727204579.77502: running the handler 42613 1727204579.77553: handler run complete 42613 1727204579.77569: attempt loop complete, returning result 42613 1727204579.77778: _execute() done 42613 1727204579.77781: dumping result to json 42613 1727204579.77784: done dumping result, returning 42613 1727204579.77794: done 
running TaskExecutor() for managed-node3/TASK: Show current_interfaces [127b8e07-fff9-2f91-05d8-000000000276] 42613 1727204579.77799: sending task result for task 127b8e07-fff9-2f91-05d8-000000000276 42613 1727204579.78019: done sending task result for task 127b8e07-fff9-2f91-05d8-000000000276 42613 1727204579.78024: WORKER PROCESS EXITING ok: [managed-node3] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 42613 1727204579.78076: no more pending results, returning what we have 42613 1727204579.78079: results queue empty 42613 1727204579.78080: checking for any_errors_fatal 42613 1727204579.78085: done checking for any_errors_fatal 42613 1727204579.78086: checking for max_fail_percentage 42613 1727204579.78088: done checking for max_fail_percentage 42613 1727204579.78089: checking to see if all hosts have failed and the running result is not ok 42613 1727204579.78090: done checking to see if all hosts have failed 42613 1727204579.78090: getting the remaining hosts for this loop 42613 1727204579.78092: done getting the remaining hosts for this loop 42613 1727204579.78097: getting the next task for host managed-node3 42613 1727204579.78105: done getting next task for host managed-node3 42613 1727204579.78108: ^ task is: TASK: Install iproute 42613 1727204579.78112: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204579.78116: getting variables 42613 1727204579.78118: in VariableManager get_vars() 42613 1727204579.78158: Calling all_inventory to load vars for managed-node3 42613 1727204579.78160: Calling groups_inventory to load vars for managed-node3 42613 1727204579.78163: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204579.78177: Calling all_plugins_play to load vars for managed-node3 42613 1727204579.78180: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204579.78183: Calling groups_plugins_play to load vars for managed-node3 42613 1727204579.78873: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204579.79346: done with get_vars() 42613 1727204579.79470: done getting variables 42613 1727204579.79716: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Tuesday 24 September 2024 15:02:59 -0400 (0:00:00.066) 0:00:08.405 ***** 42613 1727204579.79757: entering _queue_task() for managed-node3/package 42613 1727204579.80591: worker is 1 (out of 1 available) 42613 1727204579.80607: exiting _queue_task() for managed-node3/package 42613 1727204579.80622: done queuing things up, now waiting for results queue to drain 42613 1727204579.80623: waiting for pending results... 
42613 1727204579.81407: running TaskExecutor() for managed-node3/TASK: Install iproute 42613 1727204579.81744: in run() - task 127b8e07-fff9-2f91-05d8-0000000001cf 42613 1727204579.81750: variable 'ansible_search_path' from source: unknown 42613 1727204579.81754: variable 'ansible_search_path' from source: unknown 42613 1727204579.81757: calling self._execute() 42613 1727204579.81974: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204579.81979: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204579.81981: variable 'omit' from source: magic vars 42613 1727204579.83060: variable 'ansible_distribution_major_version' from source: facts 42613 1727204579.83067: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204579.83071: variable 'omit' from source: magic vars 42613 1727204579.83189: variable 'omit' from source: magic vars 42613 1727204579.83662: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 42613 1727204579.91786: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 42613 1727204579.92099: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 42613 1727204579.92386: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 42613 1727204579.92584: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 42613 1727204579.92588: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 42613 1727204579.92763: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204579.93006: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204579.93146: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204579.93198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204579.93443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204579.93894: variable '__network_is_ostree' from source: set_fact 42613 1727204579.93898: variable 'omit' from source: magic vars 42613 1727204579.93901: variable 'omit' from source: magic vars 42613 1727204579.94073: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 42613 1727204579.94077: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 42613 1727204579.94079: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 42613 1727204579.94082: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204579.94335: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204579.94339: variable 'inventory_hostname' from source: host vars for 'managed-node3' 42613 1727204579.94341: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204579.94344: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node3' 42613 1727204579.94803: Set connection var ansible_shell_executable to /bin/sh 42613 1727204579.94807: Set connection var ansible_pipelining to False 42613 1727204579.94809: Set connection var ansible_module_compression to ZIP_DEFLATED 42613 1727204579.94812: Set connection var ansible_connection to ssh 42613 1727204579.94814: Set connection var ansible_timeout to 10 42613 1727204579.94816: Set connection var ansible_shell_type to sh 42613 1727204579.94921: variable 'ansible_shell_executable' from source: unknown 42613 1727204579.94931: variable 'ansible_connection' from source: unknown 42613 1727204579.94942: variable 'ansible_module_compression' from source: unknown 42613 1727204579.94952: variable 'ansible_shell_type' from source: unknown 42613 1727204579.94959: variable 'ansible_shell_executable' from source: unknown 42613 1727204579.94970: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204579.95071: variable 'ansible_pipelining' from source: unknown 42613 1727204579.95075: variable 'ansible_timeout' from source: unknown 42613 1727204579.95077: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204579.95542: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 42613 1727204579.95546: variable 'omit' from source: magic vars 42613 1727204579.95549: starting attempt loop 42613 1727204579.95550: running the handler 42613 1727204579.95552: variable 'ansible_facts' from source: unknown 42613 1727204579.95554: variable 'ansible_facts' from source: unknown 42613 1727204579.95783: _low_level_execute_command(): starting 42613 1727204579.95790: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 42613 
1727204579.97253: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204579.97386: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204579.97484: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204579.97708: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204579.99555: stdout chunk (state=3): >>>/root <<< 42613 1727204579.99770: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204579.99774: stdout chunk (state=3): >>><<< 42613 1727204579.99777: stderr chunk (state=3): >>><<< 42613 1727204579.99862: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204579.99995: _low_level_execute_command(): starting 42613 1727204579.99999: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204579.9987953-43331-44032934229735 `" && echo ansible-tmp-1727204579.9987953-43331-44032934229735="` echo /root/.ansible/tmp/ansible-tmp-1727204579.9987953-43331-44032934229735 `" ) && sleep 0' 42613 1727204580.01642: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204580.01791: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204580.01869: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204580.01916: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204580.01995: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204580.02263: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204580.04336: stdout chunk (state=3): >>>ansible-tmp-1727204579.9987953-43331-44032934229735=/root/.ansible/tmp/ansible-tmp-1727204579.9987953-43331-44032934229735 <<< 42613 1727204580.04745: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204580.04749: stdout chunk (state=3): >>><<< 42613 1727204580.04752: stderr chunk (state=3): >>><<< 42613 1727204580.04755: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204579.9987953-43331-44032934229735=/root/.ansible/tmp/ansible-tmp-1727204579.9987953-43331-44032934229735 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204580.04757: variable 'ansible_module_compression' from source: unknown 42613 1727204580.04806: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 42613 1727204580.04816: ANSIBALLZ: Acquiring lock 42613 1727204580.04824: ANSIBALLZ: Lock acquired: 139982757271872 42613 1727204580.04836: ANSIBALLZ: Creating module 42613 1727204580.50479: ANSIBALLZ: Writing module into payload 42613 1727204580.51239: ANSIBALLZ: Writing module 42613 1727204580.51265: ANSIBALLZ: Renaming module 42613 1727204580.51521: ANSIBALLZ: Done creating module 42613 1727204580.51538: variable 'ansible_facts' from source: unknown 42613 1727204580.51740: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204579.9987953-43331-44032934229735/AnsiballZ_dnf.py 42613 1727204580.52228: Sending initial data 42613 1727204580.52231: Sent initial data (151 bytes) 42613 1727204580.54344: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204580.54349: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204580.54352: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204580.54685: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204580.54783: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204580.56840: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 debug2: Sending SSH2_FXP_REALPATH "." 
<<< 42613 1727204580.57185: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-42613igbktgy6/tmp_jxgshta /root/.ansible/tmp/ansible-tmp-1727204579.9987953-43331-44032934229735/AnsiballZ_dnf.py <<< 42613 1727204580.57190: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204579.9987953-43331-44032934229735/AnsiballZ_dnf.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-42613igbktgy6/tmp_jxgshta" to remote "/root/.ansible/tmp/ansible-tmp-1727204579.9987953-43331-44032934229735/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204579.9987953-43331-44032934229735/AnsiballZ_dnf.py" <<< 42613 1727204580.59209: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204580.59385: stderr chunk (state=3): >>><<< 42613 1727204580.59391: stdout chunk (state=3): >>><<< 42613 1727204580.59393: done transferring module to remote 42613 1727204580.59395: _low_level_execute_command(): starting 42613 1727204580.59398: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204579.9987953-43331-44032934229735/ /root/.ansible/tmp/ansible-tmp-1727204579.9987953-43331-44032934229735/AnsiballZ_dnf.py && sleep 0' 42613 1727204580.60922: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204580.61087: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204580.61177: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204580.61189: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204580.61204: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204580.61367: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204580.63492: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204580.63678: stderr chunk (state=3): >>><<< 42613 1727204580.63682: stdout chunk (state=3): >>><<< 42613 1727204580.63870: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204580.63877: _low_level_execute_command(): starting 42613 1727204580.63881: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204579.9987953-43331-44032934229735/AnsiballZ_dnf.py && sleep 0' 42613 1727204580.65391: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204580.65518: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204580.65560: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204580.65672: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204580.65936: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204581.87841: stdout 
chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 42613 1727204581.93167: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. <<< 42613 1727204581.93232: stderr chunk (state=3): >>><<< 42613 1727204581.93236: stdout chunk (state=3): >>><<< 42613 1727204581.93254: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
42613 1727204581.93294: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204579.9987953-43331-44032934229735/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None})
42613 1727204581.93304: _low_level_execute_command(): starting
42613 1727204581.93307: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204579.9987953-43331-44032934229735/ > /dev/null 2>&1 && sleep 0'
42613 1727204581.93813: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
42613 1727204581.93817: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
42613 1727204581.93820: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
42613 1727204581.93822: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
42613 1727204581.93873: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<<
42613 1727204581.93882: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
42613 1727204581.93897: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
42613 1727204581.93968: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
42613 1727204581.96032: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
42613 1727204581.96091: stderr chunk (state=3): >>><<<
42613 1727204581.96094: stdout chunk (state=3): >>><<<
42613 1727204581.96109: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
42613 1727204581.96119: handler run complete
42613 1727204581.96252: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
42613 1727204581.96392: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
42613 1727204581.96425: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
42613 1727204581.96452: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
42613 1727204581.96479: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
42613 1727204581.96536: variable '__install_status' from source: unknown
42613 1727204581.96553: Evaluated conditional (__install_status is success): True
42613 1727204581.96567: attempt loop complete, returning result
42613 1727204581.96570: _execute() done
42613 1727204581.96573: dumping result to json
42613 1727204581.96579: done dumping result, returning
42613 1727204581.96587: done running TaskExecutor() for managed-node3/TASK: Install iproute [127b8e07-fff9-2f91-05d8-0000000001cf]
42613 1727204581.96592: sending task result for task 127b8e07-fff9-2f91-05d8-0000000001cf
42613 1727204581.96700: done sending task result for task 127b8e07-fff9-2f91-05d8-0000000001cf
42613 1727204581.96703: WORKER PROCESS EXITING
ok: [managed-node3] => {
    "attempts": 1,
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do
42613 1727204581.96793: no more pending results, returning what we have
42613 1727204581.96796: results queue empty
42613 1727204581.96797: checking for any_errors_fatal
42613 1727204581.96803: done checking for any_errors_fatal
42613 1727204581.96803: checking for max_fail_percentage
42613 1727204581.96805: done checking for max_fail_percentage
42613 1727204581.96806: checking to see if all hosts have failed and the running result is not ok
42613 1727204581.96807: done checking to see if all hosts have failed
42613 1727204581.96808: getting the remaining hosts for this loop
42613 1727204581.96810: done getting the remaining hosts for this loop
42613 1727204581.96813: getting the next task for host managed-node3
42613 1727204581.96821: done getting next task for host managed-node3
42613 1727204581.96823: ^ task is: TASK: Create veth interface {{ interface }}
42613 1727204581.96826: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
42613 1727204581.96830: getting variables
42613 1727204581.96831: in VariableManager get_vars()
42613 1727204581.96877: Calling all_inventory to load vars for managed-node3
42613 1727204581.96880: Calling groups_inventory to load vars for managed-node3
42613 1727204581.96883: Calling all_plugins_inventory to load vars for managed-node3
42613 1727204581.96894: Calling all_plugins_play to load vars for managed-node3
42613 1727204581.96896: Calling groups_plugins_inventory to load vars for managed-node3
42613 1727204581.96899: Calling groups_plugins_play to load vars for managed-node3
42613 1727204581.97100: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
42613 1727204581.97243: done with get_vars()
42613 1727204581.97253: done getting variables
42613 1727204581.97302: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
42613 1727204581.97400: variable 'interface' from source: set_fact

TASK [Create veth interface ethtest0] ******************************************
task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27
Tuesday 24 September 2024 15:03:01 -0400 (0:00:02.176) 0:00:10.582 *****
42613 1727204581.97426: entering _queue_task() for managed-node3/command
42613 1727204581.97671: worker is 1 (out of 1 available)
42613 1727204581.97686: exiting _queue_task() for managed-node3/command
42613 1727204581.97700: done queuing things up, now waiting for results queue to drain
42613 1727204581.97702: waiting for pending results...
42613 1727204581.97884: running TaskExecutor() for managed-node3/TASK: Create veth interface ethtest0
42613 1727204581.97962: in run() - task 127b8e07-fff9-2f91-05d8-0000000001d0
42613 1727204581.97977: variable 'ansible_search_path' from source: unknown
42613 1727204581.97981: variable 'ansible_search_path' from source: unknown
42613 1727204581.98219: variable 'interface' from source: set_fact
42613 1727204581.98289: variable 'interface' from source: set_fact
42613 1727204581.98344: variable 'interface' from source: set_fact
42613 1727204581.98475: Loaded config def from plugin (lookup/items)
42613 1727204581.98484: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py
42613 1727204581.98504: variable 'omit' from source: magic vars
42613 1727204581.98607: variable 'ansible_host' from source: host vars for 'managed-node3'
42613 1727204581.98615: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
42613 1727204581.98623: variable 'omit' from source: magic vars
42613 1727204581.98878: variable 'ansible_distribution_major_version' from source: facts
42613 1727204581.98885: Evaluated conditional (ansible_distribution_major_version != '6'): True
42613 1727204581.99026: variable 'type' from source: set_fact
42613 1727204581.99030: variable 'state' from source: include params
42613 1727204581.99033: variable 'interface' from source: set_fact
42613 1727204581.99042: variable 'current_interfaces' from source: set_fact
42613 1727204581.99048: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True
42613 1727204581.99055: variable 'omit' from source: magic vars
42613 1727204581.99087: variable 'omit' from source: magic vars
42613 1727204581.99122: variable 'item' from source: unknown
42613 1727204581.99180: variable 'item' from source: unknown
42613 1727204581.99194: variable 'omit' from source: magic vars
42613 1727204581.99222: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
42613 1727204581.99254: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
42613 1727204581.99271: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
42613 1727204581.99286: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
42613 1727204581.99295: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
42613 1727204581.99320: variable 'inventory_hostname' from source: host vars for 'managed-node3'
42613 1727204581.99323: variable 'ansible_host' from source: host vars for 'managed-node3'
42613 1727204581.99326: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
42613 1727204581.99409: Set connection var ansible_shell_executable to /bin/sh
42613 1727204581.99415: Set connection var ansible_pipelining to False
42613 1727204581.99423: Set connection var ansible_module_compression to ZIP_DEFLATED
42613 1727204581.99426: Set connection var ansible_connection to ssh
42613 1727204581.99431: Set connection var ansible_timeout to 10
42613 1727204581.99434: Set connection var ansible_shell_type to sh
42613 1727204581.99458: variable 'ansible_shell_executable' from source: unknown
42613 1727204581.99461: variable 'ansible_connection' from source: unknown
42613 1727204581.99463: variable 'ansible_module_compression' from source: unknown
42613 1727204581.99469: variable 'ansible_shell_type' from source: unknown
42613 1727204581.99471: variable 'ansible_shell_executable' from source: unknown
42613 1727204581.99474: variable 'ansible_host' from source: host vars for 'managed-node3'
42613 1727204581.99478: variable 'ansible_pipelining' from source: unknown
42613 1727204581.99480: variable 'ansible_timeout' from source: unknown
42613 1727204581.99482: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
42613 1727204581.99595: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
42613 1727204581.99604: variable 'omit' from source: magic vars
42613 1727204581.99609: starting attempt loop
42613 1727204581.99612: running the handler
42613 1727204581.99626: _low_level_execute_command(): starting
42613 1727204581.99633: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
42613 1727204582.00205: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
42613 1727204582.00210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
42613 1727204582.00214: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
42613 1727204582.00272: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<<
42613 1727204582.00276: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
42613 1727204582.00353: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
42613 1727204582.02212: stdout chunk (state=3): >>>/root <<<
42613 1727204582.02318: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
42613 1727204582.02384: stderr chunk (state=3): >>><<<
42613 1727204582.02390: stdout chunk (state=3): >>><<<
42613 1727204582.02410: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
42613 1727204582.02425: _low_level_execute_command(): starting
42613 1727204582.02431: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204582.0240932-43500-63995048885261 `" && echo ansible-tmp-1727204582.0240932-43500-63995048885261="` echo /root/.ansible/tmp/ansible-tmp-1727204582.0240932-43500-63995048885261 `" ) && sleep 0'
42613 1727204582.02941: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
42613 1727204582.02945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
42613 1727204582.02947: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<<
42613 1727204582.02949: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
42613 1727204582.03004: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<<
42613 1727204582.03007: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
42613 1727204582.03010: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
42613 1727204582.03086: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
42613 1727204582.05281: stdout chunk (state=3): >>>ansible-tmp-1727204582.0240932-43500-63995048885261=/root/.ansible/tmp/ansible-tmp-1727204582.0240932-43500-63995048885261 <<<
42613 1727204582.05400: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
42613 1727204582.05464: stderr chunk (state=3): >>><<<
42613 1727204582.05470: stdout chunk (state=3): >>><<<
42613 1727204582.05486: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204582.0240932-43500-63995048885261=/root/.ansible/tmp/ansible-tmp-1727204582.0240932-43500-63995048885261 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
42613 1727204582.05516: variable 'ansible_module_compression' from source: unknown
42613 1727204582.05562: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-42613igbktgy6/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED
42613 1727204582.05593: variable 'ansible_facts' from source: unknown
42613 1727204582.05654: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204582.0240932-43500-63995048885261/AnsiballZ_command.py
42613 1727204582.05770: Sending initial data
42613 1727204582.05774: Sent initial data (155 bytes)
42613 1727204582.06259: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<<
42613 1727204582.06264: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
42613 1727204582.06283: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
42613 1727204582.06342: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<<
42613 1727204582.06346: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
42613 1727204582.06353: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
42613 1727204582.06424: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
42613 1727204582.08210: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<<
42613 1727204582.08277: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<<
42613 1727204582.08344: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-42613igbktgy6/tmpt9u94w79 /root/.ansible/tmp/ansible-tmp-1727204582.0240932-43500-63995048885261/AnsiballZ_command.py <<<
42613 1727204582.08351: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204582.0240932-43500-63995048885261/AnsiballZ_command.py" <<<
42613 1727204582.08417: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-42613igbktgy6/tmpt9u94w79" to remote "/root/.ansible/tmp/ansible-tmp-1727204582.0240932-43500-63995048885261/AnsiballZ_command.py" <<<
42613 1727204582.08420: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204582.0240932-43500-63995048885261/AnsiballZ_command.py" <<<
42613 1727204582.09074: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
42613 1727204582.09154: stderr chunk (state=3): >>><<<
42613 1727204582.09157: stdout chunk (state=3): >>><<<
42613 1727204582.09180: done transferring module to remote
42613 1727204582.09193: _low_level_execute_command(): starting
42613 1727204582.09198: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204582.0240932-43500-63995048885261/ /root/.ansible/tmp/ansible-tmp-1727204582.0240932-43500-63995048885261/AnsiballZ_command.py && sleep 0'
42613 1727204582.09709: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
42613 1727204582.09713: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
42613 1727204582.09720: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
42613 1727204582.09722: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
42613 1727204582.09773: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<<
42613 1727204582.09790: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
42613 1727204582.09793: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
42613 1727204582.09860: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
42613 1727204582.11876: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
42613 1727204582.11937: stderr chunk (state=3): >>><<<
42613 1727204582.11941: stdout chunk (state=3): >>><<<
42613 1727204582.11952: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
42613 1727204582.11956: _low_level_execute_command(): starting
42613 1727204582.11961: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204582.0240932-43500-63995048885261/AnsiballZ_command.py && sleep 0'
42613 1727204582.12686: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<<
42613 1727204582.12705: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
42613 1727204582.12725: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
42613 1727204582.12848: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
42613 1727204582.31281: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "ethtest0", "type", "veth", "peer", "name", "peerethtest0"], "start": "2024-09-24 15:03:02.303067", "end": "2024-09-24 15:03:02.309927", "delta": "0:00:00.006860", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add ethtest0 type veth peer name peerethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<<
42613 1727204582.34240: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. <<<
42613 1727204582.34313: stderr chunk (state=3): >>><<<
42613 1727204582.34317: stdout chunk (state=3): >>><<<
42613 1727204582.34331: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "ethtest0", "type", "veth", "peer", "name", "peerethtest0"], "start": "2024-09-24 15:03:02.303067", "end": "2024-09-24 15:03:02.309927", "delta": "0:00:00.006860", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add ethtest0 type veth peer name peerethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed.
42613 1727204582.34363: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link add ethtest0 type veth peer name peerethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204582.0240932-43500-63995048885261/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None})
42613 1727204582.34372: _low_level_execute_command(): starting
42613 1727204582.34377: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204582.0240932-43500-63995048885261/ > /dev/null 2>&1 && sleep 0'
42613 1727204582.34871: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
42613 1727204582.34881: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found <<<
42613 1727204582.34906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
42613 1727204582.34910: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found <<<
42613 1727204582.34913: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
42613 1727204582.34978: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<<
42613 1727204582.34981: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
42613 1727204582.34984: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
42613 1727204582.35179: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
42613 1727204582.41070: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
42613 1727204582.41122: stderr chunk (state=3): >>><<<
42613 1727204582.41127: stdout chunk (state=3): >>><<<
42613 1727204582.41141: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
42613 1727204582.41147: handler run complete
42613 1727204582.41167: Evaluated conditional (False): False
42613 1727204582.41180: attempt loop complete, returning result
42613 1727204582.41197: variable 'item' from source: unknown
42613 1727204582.41286: variable 'item' from source: unknown
ok: [managed-node3] => (item=ip link add ethtest0 type veth peer name peerethtest0) => {
    "ansible_loop_var": "item",
    "changed": false,
    "cmd": [
        "ip",
        "link",
        "add",
        "ethtest0",
        "type",
        "veth",
        "peer",
        "name",
        "peerethtest0"
    ],
    "delta": "0:00:00.006860",
    "end": "2024-09-24 15:03:02.309927",
    "item": "ip link add ethtest0 type veth peer name peerethtest0",
    "rc": 0,
    "start": "2024-09-24 15:03:02.303067"
}
42613 1727204582.41491: variable 'ansible_host' from source: host vars for 'managed-node3'
42613 1727204582.41500: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
42613 1727204582.41503: variable 'omit' from source: magic vars
42613 1727204582.41586: variable 'ansible_distribution_major_version' from source: facts
42613 1727204582.41589: Evaluated conditional (ansible_distribution_major_version != '6'): True
42613 1727204582.41749: variable 'type' from source: set_fact
42613 1727204582.41752: variable 'state' from source: include params
42613 1727204582.41755: variable 'interface' from source: set_fact
42613 1727204582.41757: variable 'current_interfaces' from source: set_fact
42613 1727204582.41786: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True
42613 1727204582.41790: variable 'omit' from source: magic vars
42613 1727204582.41795: variable 'omit' from source: magic vars
42613 1727204582.41926: variable 'item' from source: unknown
42613 1727204582.41930: variable 'item' from source: unknown
42613 1727204582.41944: 
variable 'omit' from source: magic vars 42613 1727204582.41972: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 42613 1727204582.41982: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204582.41989: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204582.42007: variable 'inventory_hostname' from source: host vars for 'managed-node3' 42613 1727204582.42024: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204582.42027: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204582.42177: Set connection var ansible_shell_executable to /bin/sh 42613 1727204582.42181: Set connection var ansible_pipelining to False 42613 1727204582.42184: Set connection var ansible_module_compression to ZIP_DEFLATED 42613 1727204582.42186: Set connection var ansible_connection to ssh 42613 1727204582.42189: Set connection var ansible_timeout to 10 42613 1727204582.42191: Set connection var ansible_shell_type to sh 42613 1727204582.42193: variable 'ansible_shell_executable' from source: unknown 42613 1727204582.42195: variable 'ansible_connection' from source: unknown 42613 1727204582.42198: variable 'ansible_module_compression' from source: unknown 42613 1727204582.42200: variable 'ansible_shell_type' from source: unknown 42613 1727204582.42202: variable 'ansible_shell_executable' from source: unknown 42613 1727204582.42204: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204582.42207: variable 'ansible_pipelining' from source: unknown 42613 1727204582.42209: variable 'ansible_timeout' from source: unknown 42613 1727204582.42211: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 
1727204582.42350: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 42613 1727204582.42362: variable 'omit' from source: magic vars 42613 1727204582.42368: starting attempt loop 42613 1727204582.42371: running the handler 42613 1727204582.42380: _low_level_execute_command(): starting 42613 1727204582.42384: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 42613 1727204582.43225: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204582.43253: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204582.43298: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204582.43339: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204582.43409: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 42613 1727204582.45249: stdout chunk (state=3): >>>/root <<< 42613 1727204582.45388: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204582.45413: stderr chunk (state=3): >>><<< 42613 1727204582.45418: stdout chunk (state=3): >>><<< 42613 1727204582.45432: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204582.45443: _low_level_execute_command(): starting 42613 1727204582.45448: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204582.454319-43500-136344580028640 `" && echo ansible-tmp-1727204582.454319-43500-136344580028640="` echo 
/root/.ansible/tmp/ansible-tmp-1727204582.454319-43500-136344580028640 `" ) && sleep 0' 42613 1727204582.45961: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204582.45968: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204582.45971: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 42613 1727204582.45973: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204582.46021: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204582.46028: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204582.46031: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204582.46103: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204582.48261: stdout chunk (state=3): >>>ansible-tmp-1727204582.454319-43500-136344580028640=/root/.ansible/tmp/ansible-tmp-1727204582.454319-43500-136344580028640 <<< 42613 1727204582.48456: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204582.48494: stderr chunk (state=3): >>><<< 42613 
1727204582.48498: stdout chunk (state=3): >>><<< 42613 1727204582.48542: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204582.454319-43500-136344580028640=/root/.ansible/tmp/ansible-tmp-1727204582.454319-43500-136344580028640 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204582.48546: variable 'ansible_module_compression' from source: unknown 42613 1727204582.48584: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-42613igbktgy6/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 42613 1727204582.48677: variable 'ansible_facts' from source: unknown 42613 1727204582.48680: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204582.454319-43500-136344580028640/AnsiballZ_command.py 42613 1727204582.48924: Sending initial data 42613 1727204582.48927: Sent initial data (155 bytes) 42613 
1727204582.49607: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204582.49691: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204582.49698: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204582.49702: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204582.49804: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204582.51617: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension 
"expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 42613 1727204582.51701: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 42613 1727204582.51785: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-42613igbktgy6/tmp5kqib3xk /root/.ansible/tmp/ansible-tmp-1727204582.454319-43500-136344580028640/AnsiballZ_command.py <<< 42613 1727204582.51789: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204582.454319-43500-136344580028640/AnsiballZ_command.py" <<< 42613 1727204582.51858: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-42613igbktgy6/tmp5kqib3xk" to remote "/root/.ansible/tmp/ansible-tmp-1727204582.454319-43500-136344580028640/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204582.454319-43500-136344580028640/AnsiballZ_command.py" <<< 42613 1727204582.52796: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204582.52886: stderr chunk (state=3): >>><<< 42613 1727204582.52901: stdout chunk (state=3): >>><<< 42613 1727204582.53030: done transferring module to remote 42613 1727204582.53037: _low_level_execute_command(): starting 42613 1727204582.53040: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204582.454319-43500-136344580028640/ /root/.ansible/tmp/ansible-tmp-1727204582.454319-43500-136344580028640/AnsiballZ_command.py && sleep 0' 42613 1727204582.53774: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204582.53832: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204582.53909: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204582.56075: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204582.56079: stdout chunk (state=3): >>><<< 42613 1727204582.56082: stderr chunk (state=3): >>><<< 42613 1727204582.56085: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204582.56087: _low_level_execute_command(): starting 42613 1727204582.56089: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204582.454319-43500-136344580028640/AnsiballZ_command.py && sleep 0' 42613 1727204582.56877: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204582.56882: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204582.56928: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204582.56969: stderr chunk (state=3): 
>>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204582.57010: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204582.57110: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204582.75042: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerethtest0", "up"], "start": "2024-09-24 15:03:02.744941", "end": "2024-09-24 15:03:02.749131", "delta": "0:00:00.004190", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 42613 1727204582.76908: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. <<< 42613 1727204582.76950: stderr chunk (state=3): >>><<< 42613 1727204582.76971: stdout chunk (state=3): >>><<< 42613 1727204582.76996: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerethtest0", "up"], "start": "2024-09-24 15:03:02.744941", "end": "2024-09-24 15:03:02.749131", "delta": "0:00:00.004190", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 42613 1727204582.77053: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set peerethtest0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204582.454319-43500-136344580028640/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 42613 1727204582.77124: _low_level_execute_command(): starting 42613 1727204582.77127: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204582.454319-43500-136344580028640/ > /dev/null 2>&1 && sleep 0' 42613 1727204582.77807: stderr chunk (state=2): >>>OpenSSH_9.6p1, 
OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204582.77929: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204582.77975: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204582.78086: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204582.80379: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204582.80383: stdout chunk (state=3): >>><<< 42613 1727204582.80386: stderr chunk (state=3): >>><<< 42613 1727204582.80389: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204582.80391: handler run complete 42613 1727204582.80393: Evaluated conditional (False): False 42613 1727204582.80395: attempt loop complete, returning result 42613 1727204582.80397: variable 'item' from source: unknown 42613 1727204582.80417: variable 'item' from source: unknown ok: [managed-node3] => (item=ip link set peerethtest0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "peerethtest0", "up" ], "delta": "0:00:00.004190", "end": "2024-09-24 15:03:02.749131", "item": "ip link set peerethtest0 up", "rc": 0, "start": "2024-09-24 15:03:02.744941" } 42613 1727204582.80812: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204582.80816: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204582.80819: variable 'omit' from source: magic vars 42613 1727204582.81006: variable 'ansible_distribution_major_version' from source: facts 42613 1727204582.81572: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204582.81736: variable 'type' from source: set_fact 42613 1727204582.81814: variable 'state' from source: include params 42613 1727204582.81823: variable 'interface' from source: set_fact 42613 
1727204582.81831: variable 'current_interfaces' from source: set_fact 42613 1727204582.81844: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 42613 1727204582.81854: variable 'omit' from source: magic vars 42613 1727204582.81934: variable 'omit' from source: magic vars 42613 1727204582.81987: variable 'item' from source: unknown 42613 1727204582.82350: variable 'item' from source: unknown 42613 1727204582.82354: variable 'omit' from source: magic vars 42613 1727204582.82356: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 42613 1727204582.82359: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204582.82361: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204582.82572: variable 'inventory_hostname' from source: host vars for 'managed-node3' 42613 1727204582.82575: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204582.82578: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204582.82680: Set connection var ansible_shell_executable to /bin/sh 42613 1727204582.82785: Set connection var ansible_pipelining to False 42613 1727204582.82788: Set connection var ansible_module_compression to ZIP_DEFLATED 42613 1727204582.82792: Set connection var ansible_connection to ssh 42613 1727204582.82794: Set connection var ansible_timeout to 10 42613 1727204582.82796: Set connection var ansible_shell_type to sh 42613 1727204582.82798: variable 'ansible_shell_executable' from source: unknown 42613 1727204582.82800: variable 'ansible_connection' from source: unknown 42613 1727204582.82802: variable 'ansible_module_compression' from source: unknown 42613 1727204582.82804: 
variable 'ansible_shell_type' from source: unknown 42613 1727204582.82806: variable 'ansible_shell_executable' from source: unknown 42613 1727204582.82808: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204582.82810: variable 'ansible_pipelining' from source: unknown 42613 1727204582.82812: variable 'ansible_timeout' from source: unknown 42613 1727204582.82814: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204582.83149: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 42613 1727204582.83152: variable 'omit' from source: magic vars 42613 1727204582.83154: starting attempt loop 42613 1727204582.83162: running the handler 42613 1727204582.83170: _low_level_execute_command(): starting 42613 1727204582.83179: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 42613 1727204582.84485: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204582.84490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204582.84797: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204582.84906: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204582.86736: stdout chunk (state=3): >>>/root <<< 42613 1727204582.87325: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204582.87329: stderr chunk (state=3): >>><<< 42613 1727204582.87332: stdout chunk (state=3): >>><<< 42613 1727204582.87335: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 42613 1727204582.87337: _low_level_execute_command(): starting 42613 1727204582.87340: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204582.8721735-43500-271312491735218 `" && echo ansible-tmp-1727204582.8721735-43500-271312491735218="` echo /root/.ansible/tmp/ansible-tmp-1727204582.8721735-43500-271312491735218 `" ) && sleep 0' 42613 1727204582.88620: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204582.88625: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration <<< 42613 1727204582.88627: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found <<< 42613 1727204582.88630: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204582.88690: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204582.88694: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204582.88719: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204582.88822: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 42613 1727204582.91044: stdout chunk (state=3): >>>ansible-tmp-1727204582.8721735-43500-271312491735218=/root/.ansible/tmp/ansible-tmp-1727204582.8721735-43500-271312491735218 <<< 42613 1727204582.91217: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204582.91464: stderr chunk (state=3): >>><<< 42613 1727204582.91470: stdout chunk (state=3): >>><<< 42613 1727204582.91482: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204582.8721735-43500-271312491735218=/root/.ansible/tmp/ansible-tmp-1727204582.8721735-43500-271312491735218 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204582.91672: variable 'ansible_module_compression' from source: unknown 42613 1727204582.91677: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-42613igbktgy6/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 42613 1727204582.91680: variable 'ansible_facts' from source: unknown 42613 1727204582.91973: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204582.8721735-43500-271312491735218/AnsiballZ_command.py 42613 1727204582.92132: Sending initial data 42613 1727204582.92145: Sent initial data (156 bytes) 42613 1727204582.92895: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204582.92915: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 42613 1727204582.92995: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204582.93034: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204582.93063: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204582.93104: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204582.93324: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204582.94998: stderr chunk (state=3): >>>debug2: Remote 
version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 42613 1727204582.95104: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 42613 1727204582.95206: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-42613igbktgy6/tmp85a_cufg /root/.ansible/tmp/ansible-tmp-1727204582.8721735-43500-271312491735218/AnsiballZ_command.py <<< 42613 1727204582.95210: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204582.8721735-43500-271312491735218/AnsiballZ_command.py" <<< 42613 1727204582.95301: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-42613igbktgy6/tmp85a_cufg" to remote "/root/.ansible/tmp/ansible-tmp-1727204582.8721735-43500-271312491735218/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204582.8721735-43500-271312491735218/AnsiballZ_command.py" <<< 42613 1727204582.96169: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204582.96297: stderr chunk (state=3): >>><<< 42613 1727204582.96307: stdout chunk (state=3): >>><<< 42613 1727204582.96435: done transferring module to remote 42613 1727204582.96439: _low_level_execute_command(): starting 
42613 1727204582.96441: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204582.8721735-43500-271312491735218/ /root/.ansible/tmp/ansible-tmp-1727204582.8721735-43500-271312491735218/AnsiballZ_command.py && sleep 0' 42613 1727204582.97087: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204582.97300: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204582.97357: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204582.97497: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204582.99774: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204582.99779: stdout chunk (state=3): >>><<< 42613 1727204582.99782: stderr chunk (state=3): >>><<< 42613 1727204582.99785: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204582.99787: _low_level_execute_command(): starting 42613 1727204582.99790: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204582.8721735-43500-271312491735218/AnsiballZ_command.py && sleep 0' 42613 1727204583.00489: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204583.00557: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204583.00613: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204583.00635: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204583.00660: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204583.00789: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204583.18866: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "ethtest0", "up"], "start": "2024-09-24 15:03:03.182579", "end": "2024-09-24 15:03:03.186851", "delta": "0:00:00.004272", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set ethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 42613 1727204583.20691: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
<<< 42613 1727204583.20702: stdout chunk (state=3): >>><<< 42613 1727204583.20714: stderr chunk (state=3): >>><<< 42613 1727204583.20746: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "ethtest0", "up"], "start": "2024-09-24 15:03:03.182579", "end": "2024-09-24 15:03:03.186851", "delta": "0:00:00.004272", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set ethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
42613 1727204583.20788: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set ethtest0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204582.8721735-43500-271312491735218/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 42613 1727204583.20800: _low_level_execute_command(): starting 42613 1727204583.20809: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204582.8721735-43500-271312491735218/ > /dev/null 2>&1 && sleep 0' 42613 1727204583.21517: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204583.21535: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204583.21552: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204583.21579: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 42613 1727204583.21686: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204583.21703: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204583.21723: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204583.21823: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204583.23964: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204583.23970: stdout chunk (state=3): >>><<< 42613 1727204583.23975: stderr chunk (state=3): >>><<< 42613 1727204583.24005: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204583.24009: handler run complete 42613 1727204583.24035: Evaluated conditional (False): False 42613 1727204583.24043: attempt loop complete, returning result 42613 1727204583.24071: variable 'item' from source: unknown 42613 1727204583.24272: variable 'item' from source: unknown ok: [managed-node3] => (item=ip link set ethtest0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "ethtest0", "up" ], "delta": "0:00:00.004272", "end": "2024-09-24 15:03:03.186851", "item": "ip link set ethtest0 up", "rc": 0, "start": "2024-09-24 15:03:03.182579" } 42613 1727204583.24601: dumping result to json 42613 1727204583.24606: done dumping result, returning 42613 1727204583.24608: done running TaskExecutor() for managed-node3/TASK: Create veth interface ethtest0 [127b8e07-fff9-2f91-05d8-0000000001d0] 42613 1727204583.24611: sending task result for task 127b8e07-fff9-2f91-05d8-0000000001d0 42613 1727204583.24738: done sending task result for task 127b8e07-fff9-2f91-05d8-0000000001d0 42613 1727204583.24742: WORKER PROCESS EXITING 42613 1727204583.24821: no more pending results, returning what we have 42613 1727204583.24824: results queue empty 42613 1727204583.24825: checking for any_errors_fatal 42613 1727204583.24829: done checking for any_errors_fatal 42613 1727204583.24830: checking for max_fail_percentage 42613 1727204583.24831: done checking for max_fail_percentage 42613 1727204583.24832: checking to see if all hosts have failed and the running result is not ok 42613 1727204583.24833: done checking to see if all hosts have failed 42613 1727204583.24833: getting the remaining hosts for this loop 42613 1727204583.24835: done getting the remaining hosts for this loop 42613 1727204583.24838: getting the next task for host managed-node3 42613 1727204583.24843: done getting next task for host managed-node3 42613 1727204583.24845: ^ task is: TASK: 
Set up veth as managed by NetworkManager 42613 1727204583.24848: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 42613 1727204583.24863: getting variables 42613 1727204583.24865: in VariableManager get_vars() 42613 1727204583.24895: Calling all_inventory to load vars for managed-node3 42613 1727204583.24899: Calling groups_inventory to load vars for managed-node3 42613 1727204583.24901: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204583.24912: Calling all_plugins_play to load vars for managed-node3 42613 1727204583.24914: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204583.24917: Calling groups_plugins_play to load vars for managed-node3 42613 1727204583.25417: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204583.25652: done with get_vars() 42613 1727204583.25664: done getting variables 42613 1727204583.25725: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: 
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Tuesday 24 September 2024 15:03:03 -0400 (0:00:01.283) 0:00:11.865 ***** 42613 1727204583.25764: entering _queue_task() for managed-node3/command 42613 1727204583.26171: worker is 1 (out of 1 available) 42613 1727204583.26303: exiting _queue_task() for managed-node3/command 42613 1727204583.26316: done queuing things up, now waiting for results queue to drain 42613 1727204583.26318: waiting for pending results... 42613 1727204583.26748: running TaskExecutor() for managed-node3/TASK: Set up veth as managed by NetworkManager 42613 1727204583.26755: in run() - task 127b8e07-fff9-2f91-05d8-0000000001d1 42613 1727204583.26759: variable 'ansible_search_path' from source: unknown 42613 1727204583.26762: variable 'ansible_search_path' from source: unknown 42613 1727204583.26767: calling self._execute() 42613 1727204583.26863: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204583.26879: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204583.26953: variable 'omit' from source: magic vars 42613 1727204583.27336: variable 'ansible_distribution_major_version' from source: facts 42613 1727204583.27356: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204583.27725: variable 'type' from source: set_fact 42613 1727204583.27736: variable 'state' from source: include params 42613 1727204583.27749: Evaluated conditional (type == 'veth' and state == 'present'): True 42613 1727204583.27760: variable 'omit' from source: magic vars 42613 1727204583.28278: variable 'omit' from source: magic vars 42613 1727204583.28283: variable 'interface' from source: set_fact 42613 1727204583.28285: variable 'omit' from source: magic vars 42613 1727204583.28288: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 42613 1727204583.28517: Loading Connection 
'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 42613 1727204583.28549: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 42613 1727204583.28578: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204583.28598: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204583.28662: variable 'inventory_hostname' from source: host vars for 'managed-node3' 42613 1727204583.28728: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204583.28742: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204583.29066: Set connection var ansible_shell_executable to /bin/sh 42613 1727204583.29081: Set connection var ansible_pipelining to False 42613 1727204583.29093: Set connection var ansible_module_compression to ZIP_DEFLATED 42613 1727204583.29173: Set connection var ansible_connection to ssh 42613 1727204583.29177: Set connection var ansible_timeout to 10 42613 1727204583.29180: Set connection var ansible_shell_type to sh 42613 1727204583.29211: variable 'ansible_shell_executable' from source: unknown 42613 1727204583.29281: variable 'ansible_connection' from source: unknown 42613 1727204583.29285: variable 'ansible_module_compression' from source: unknown 42613 1727204583.29287: variable 'ansible_shell_type' from source: unknown 42613 1727204583.29291: variable 'ansible_shell_executable' from source: unknown 42613 1727204583.29298: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204583.29306: variable 'ansible_pipelining' from source: unknown 42613 1727204583.29312: variable 'ansible_timeout' from source: unknown 42613 1727204583.29319: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node3' 42613 1727204583.29803: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 42613 1727204583.29809: variable 'omit' from source: magic vars 42613 1727204583.29811: starting attempt loop 42613 1727204583.29813: running the handler 42613 1727204583.29815: _low_level_execute_command(): starting 42613 1727204583.29873: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 42613 1727204583.31403: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204583.31410: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204583.31464: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204583.31544: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204583.31570: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 
4 <<< 42613 1727204583.31762: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204583.33586: stdout chunk (state=3): >>>/root <<< 42613 1727204583.33820: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204583.33828: stdout chunk (state=3): >>><<< 42613 1727204583.33834: stderr chunk (state=3): >>><<< 42613 1727204583.33913: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204583.33917: _low_level_execute_command(): starting 42613 1727204583.33926: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204583.3389127-43627-137553942429716 `" && echo ansible-tmp-1727204583.3389127-43627-137553942429716="` echo 
/root/.ansible/tmp/ansible-tmp-1727204583.3389127-43627-137553942429716 `" ) && sleep 0' 42613 1727204583.34998: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204583.35003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204583.35104: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204583.35108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204583.35112: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204583.35114: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204583.35155: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204583.35230: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204583.37391: stdout chunk (state=3): >>>ansible-tmp-1727204583.3389127-43627-137553942429716=/root/.ansible/tmp/ansible-tmp-1727204583.3389127-43627-137553942429716 <<< 42613 1727204583.37687: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204583.37691: stdout chunk (state=3): >>><<< 42613 
1727204583.37694: stderr chunk (state=3): >>><<< 42613 1727204583.37697: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204583.3389127-43627-137553942429716=/root/.ansible/tmp/ansible-tmp-1727204583.3389127-43627-137553942429716 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204583.37699: variable 'ansible_module_compression' from source: unknown 42613 1727204583.37751: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-42613igbktgy6/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 42613 1727204583.37787: variable 'ansible_facts' from source: unknown 42613 1727204583.37886: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204583.3389127-43627-137553942429716/AnsiballZ_command.py 42613 1727204583.38102: Sending initial data 42613 1727204583.38105: Sent initial data (156 bytes) 42613 
1727204583.39051: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204583.39069: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204583.39172: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204583.40947: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server 
extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 42613 1727204583.41011: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 42613 1727204583.41109: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-42613igbktgy6/tmpfmeh3azp /root/.ansible/tmp/ansible-tmp-1727204583.3389127-43627-137553942429716/AnsiballZ_command.py <<< 42613 1727204583.41112: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204583.3389127-43627-137553942429716/AnsiballZ_command.py" <<< 42613 1727204583.41177: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-42613igbktgy6/tmpfmeh3azp" to remote "/root/.ansible/tmp/ansible-tmp-1727204583.3389127-43627-137553942429716/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204583.3389127-43627-137553942429716/AnsiballZ_command.py" <<< 42613 1727204583.42578: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204583.42583: stderr chunk (state=3): >>><<< 42613 1727204583.42585: stdout chunk (state=3): >>><<< 42613 1727204583.42587: done transferring module to remote 42613 1727204583.42590: _low_level_execute_command(): starting 42613 1727204583.42592: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204583.3389127-43627-137553942429716/ /root/.ansible/tmp/ansible-tmp-1727204583.3389127-43627-137553942429716/AnsiballZ_command.py && sleep 0' 42613 1727204583.43797: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204583.44016: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204583.44050: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204583.44170: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204583.46291: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204583.46427: stderr chunk (state=3): >>><<< 42613 1727204583.46437: stdout chunk (state=3): >>><<< 42613 1727204583.46462: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204583.46474: _low_level_execute_command(): starting 42613 1727204583.46484: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204583.3389127-43627-137553942429716/AnsiballZ_command.py && sleep 0' 42613 1727204583.47843: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204583.48067: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204583.48310: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204583.48388: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 42613 1727204583.68164: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "ethtest0", "managed", "true"], "start": "2024-09-24 15:03:03.659411", "end": "2024-09-24 15:03:03.679426", "delta": "0:00:00.020015", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set ethtest0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 42613 1727204583.70993: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. <<< 42613 1727204583.71175: stderr chunk (state=3): >>><<< 42613 1727204583.71179: stdout chunk (state=3): >>><<< 42613 1727204583.71183: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "ethtest0", "managed", "true"], "start": "2024-09-24 15:03:03.659411", "end": "2024-09-24 15:03:03.679426", "delta": "0:00:00.020015", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set ethtest0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 42613 1727204583.71739: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli d set ethtest0 managed true', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204583.3389127-43627-137553942429716/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 42613 1727204583.71763: _low_level_execute_command(): starting 42613 1727204583.71776: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204583.3389127-43627-137553942429716/ > /dev/null 2>&1 && sleep 0' 42613 1727204583.72536: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204583.72556: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204583.72575: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204583.72595: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 42613 1727204583.72617: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 42613 1727204583.72654: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204583.72740: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204583.72772: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204583.72796: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204583.72895: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204583.75243: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204583.75248: stdout chunk (state=3): >>><<< 42613 1727204583.75251: stderr chunk (state=3): >>><<< 42613 1727204583.75280: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204583.75293: handler run complete 42613 1727204583.75367: Evaluated conditional (False): False 42613 1727204583.75373: attempt loop complete, returning result 42613 1727204583.75376: _execute() done 42613 1727204583.75378: dumping result to json 42613 1727204583.75380: done dumping result, returning 42613 1727204583.75382: done running TaskExecutor() for managed-node3/TASK: Set up veth as managed by NetworkManager [127b8e07-fff9-2f91-05d8-0000000001d1] 42613 1727204583.75474: sending task result for task 127b8e07-fff9-2f91-05d8-0000000001d1 42613 1727204583.75564: done sending task result for task 127b8e07-fff9-2f91-05d8-0000000001d1 42613 1727204583.75570: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": [ "nmcli", "d", "set", "ethtest0", "managed", "true" ], "delta": "0:00:00.020015", "end": "2024-09-24 15:03:03.679426", "rc": 0, "start": "2024-09-24 15:03:03.659411" } 42613 1727204583.75659: no more pending results, returning what we have 42613 1727204583.75662: results queue empty 42613 1727204583.75663: checking for any_errors_fatal 
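The module result above arrives as a single JSON document on the stdout of the remote `AnsiballZ_command.py` wrapper; `_low_level_execute_command()` returns it verbatim and the action plugin parses it before building the `ok:` record. A minimal sketch of that parse step using the exact payload from this run (a sketch, not Ansible's actual parsing code; `module_args` trimmed for brevity):

```python
import json

# JSON result emitted by the remote AnsiballZ_command.py wrapper, copied
# from the stdout chunk logged above (invocation/module_args omitted).
raw_stdout = (
    '{"changed": true, "stdout": "", "stderr": "", "rc": 0,'
    ' "cmd": ["nmcli", "d", "set", "ethtest0", "managed", "true"],'
    ' "start": "2024-09-24 15:03:03.659411",'
    ' "end": "2024-09-24 15:03:03.679426",'
    ' "delta": "0:00:00.020015", "msg": ""}'
)

result = json.loads(raw_stdout)
print(result["rc"], result["cmd"])
# 0 ['nmcli', 'd', 'set', 'ethtest0', 'managed', 'true']
```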
42613 1727204583.75802: done checking for any_errors_fatal 42613 1727204583.75804: checking for max_fail_percentage 42613 1727204583.75806: done checking for max_fail_percentage 42613 1727204583.75807: checking to see if all hosts have failed and the running result is not ok 42613 1727204583.75808: done checking to see if all hosts have failed 42613 1727204583.75809: getting the remaining hosts for this loop 42613 1727204583.75811: done getting the remaining hosts for this loop 42613 1727204583.75815: getting the next task for host managed-node3 42613 1727204583.75876: done getting next task for host managed-node3 42613 1727204583.75880: ^ task is: TASK: Delete veth interface {{ interface }} 42613 1727204583.75884: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204583.75888: getting variables 42613 1727204583.75890: in VariableManager get_vars() 42613 1727204583.76047: Calling all_inventory to load vars for managed-node3 42613 1727204583.76050: Calling groups_inventory to load vars for managed-node3 42613 1727204583.76053: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204583.76149: Calling all_plugins_play to load vars for managed-node3 42613 1727204583.76154: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204583.76159: Calling groups_plugins_play to load vars for managed-node3 42613 1727204583.76822: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204583.78091: done with get_vars() 42613 1727204583.78105: done getting variables 42613 1727204583.78308: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 42613 1727204583.78518: variable 'interface' from source: set_fact TASK [Delete veth interface ethtest0] ****************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Tuesday 24 September 2024 15:03:03 -0400 (0:00:00.527) 0:00:12.393 ***** 42613 1727204583.78554: entering _queue_task() for managed-node3/command 42613 1727204583.79324: worker is 1 (out of 1 available) 42613 1727204583.79342: exiting _queue_task() for managed-node3/command 42613 1727204583.79355: done queuing things up, now waiting for results queue to drain 42613 1727204583.79356: waiting for pending results... 
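The TaskExecutor run that follows evaluates `type == 'veth' and state == 'absent' and interface in current_interfaces` to False and skips the delete task. A toy Python equivalent of that short-circuit check; note the log records only the final boolean, so every variable value here except `interface` is an assumption:

```python
# Assumed values -- the log shows only "Evaluated conditional (...): False".
# In this play the veth exists but state is not 'absent', so the
# "Delete veth interface ethtest0" task is skipped.
type_ = "veth"
state = "present"                                  # assumption: not logged
interface = "ethtest0"
current_interfaces = ["lo", "eth0", "ethtest0"]    # assumption: not logged

should_run = (type_ == "veth" and state == "absent"
              and interface in current_interfaces)
print(should_run)   # False -> "when evaluation is False, skipping this task"
```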
42613 1727204583.80086: running TaskExecutor() for managed-node3/TASK: Delete veth interface ethtest0 42613 1727204583.80092: in run() - task 127b8e07-fff9-2f91-05d8-0000000001d2 42613 1727204583.80095: variable 'ansible_search_path' from source: unknown 42613 1727204583.80103: variable 'ansible_search_path' from source: unknown 42613 1727204583.80272: calling self._execute() 42613 1727204583.80276: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204583.80279: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204583.80281: variable 'omit' from source: magic vars 42613 1727204583.80691: variable 'ansible_distribution_major_version' from source: facts 42613 1727204583.80712: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204583.81264: variable 'type' from source: set_fact 42613 1727204583.81387: variable 'state' from source: include params 42613 1727204583.81399: variable 'interface' from source: set_fact 42613 1727204583.81672: variable 'current_interfaces' from source: set_fact 42613 1727204583.81676: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): False 42613 1727204583.81679: when evaluation is False, skipping this task 42613 1727204583.81682: _execute() done 42613 1727204583.81684: dumping result to json 42613 1727204583.81687: done dumping result, returning 42613 1727204583.81689: done running TaskExecutor() for managed-node3/TASK: Delete veth interface ethtest0 [127b8e07-fff9-2f91-05d8-0000000001d2] 42613 1727204583.81692: sending task result for task 127b8e07-fff9-2f91-05d8-0000000001d2 42613 1727204583.81780: done sending task result for task 127b8e07-fff9-2f91-05d8-0000000001d2 42613 1727204583.81784: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "type == 'veth' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was 
False" } 42613 1727204583.82022: no more pending results, returning what we have 42613 1727204583.82026: results queue empty 42613 1727204583.82027: checking for any_errors_fatal 42613 1727204583.82035: done checking for any_errors_fatal 42613 1727204583.82036: checking for max_fail_percentage 42613 1727204583.82038: done checking for max_fail_percentage 42613 1727204583.82039: checking to see if all hosts have failed and the running result is not ok 42613 1727204583.82040: done checking to see if all hosts have failed 42613 1727204583.82040: getting the remaining hosts for this loop 42613 1727204583.82042: done getting the remaining hosts for this loop 42613 1727204583.82047: getting the next task for host managed-node3 42613 1727204583.82053: done getting next task for host managed-node3 42613 1727204583.82056: ^ task is: TASK: Create dummy interface {{ interface }} 42613 1727204583.82060: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204583.82064: getting variables 42613 1727204583.82069: in VariableManager get_vars() 42613 1727204583.82113: Calling all_inventory to load vars for managed-node3 42613 1727204583.82117: Calling groups_inventory to load vars for managed-node3 42613 1727204583.82119: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204583.82134: Calling all_plugins_play to load vars for managed-node3 42613 1727204583.82137: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204583.82141: Calling groups_plugins_play to load vars for managed-node3 42613 1727204583.82626: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204583.82930: done with get_vars() 42613 1727204583.82951: done getting variables 42613 1727204583.83037: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 42613 1727204583.83193: variable 'interface' from source: set_fact TASK [Create dummy interface ethtest0] ***************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Tuesday 24 September 2024 15:03:03 -0400 (0:00:00.046) 0:00:12.440 ***** 42613 1727204583.83230: entering _queue_task() for managed-node3/command 42613 1727204583.83636: worker is 1 (out of 1 available) 42613 1727204583.83651: exiting _queue_task() for managed-node3/command 42613 1727204583.83670: done queuing things up, now waiting for results queue to drain 42613 1727204583.83672: waiting for pending results... 
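Before the `TASK [Create dummy interface ethtest0]` banner below is printed, the task name `Create dummy interface {{ interface }}` is rendered against `interface` from set_fact. Ansible really does this with Jinja2; the regex stand-in below is only a sketch of the substitution's effect, not the actual templar:

```python
import re

def render(template: str, **variables) -> str:
    # Minimal "{{ var }}" substitution standing in for Jinja2 templating.
    return re.sub(r"\{\{\s*(\w+)\s*\}\}",
                  lambda m: str(variables[m.group(1)]),
                  template)

print(render("Create dummy interface {{ interface }}", interface="ethtest0"))
# Create dummy interface ethtest0
```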
42613 1727204583.84084: running TaskExecutor() for managed-node3/TASK: Create dummy interface ethtest0 42613 1727204583.84115: in run() - task 127b8e07-fff9-2f91-05d8-0000000001d3 42613 1727204583.84138: variable 'ansible_search_path' from source: unknown 42613 1727204583.84212: variable 'ansible_search_path' from source: unknown 42613 1727204583.84217: calling self._execute() 42613 1727204583.84313: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204583.84326: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204583.84342: variable 'omit' from source: magic vars 42613 1727204583.85492: variable 'ansible_distribution_major_version' from source: facts 42613 1727204583.85496: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204583.85752: variable 'type' from source: set_fact 42613 1727204583.85775: variable 'state' from source: include params 42613 1727204583.85799: variable 'interface' from source: set_fact 42613 1727204583.85831: variable 'current_interfaces' from source: set_fact 42613 1727204583.85858: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 42613 1727204583.85870: when evaluation is False, skipping this task 42613 1727204583.85932: _execute() done 42613 1727204583.85941: dumping result to json 42613 1727204583.85950: done dumping result, returning 42613 1727204583.85961: done running TaskExecutor() for managed-node3/TASK: Create dummy interface ethtest0 [127b8e07-fff9-2f91-05d8-0000000001d3] 42613 1727204583.86000: sending task result for task 127b8e07-fff9-2f91-05d8-0000000001d3 42613 1727204583.86362: done sending task result for task 127b8e07-fff9-2f91-05d8-0000000001d3 42613 1727204583.86368: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional 
result was False" } 42613 1727204583.86451: no more pending results, returning what we have 42613 1727204583.86456: results queue empty 42613 1727204583.86457: checking for any_errors_fatal 42613 1727204583.86467: done checking for any_errors_fatal 42613 1727204583.86468: checking for max_fail_percentage 42613 1727204583.86471: done checking for max_fail_percentage 42613 1727204583.86472: checking to see if all hosts have failed and the running result is not ok 42613 1727204583.86473: done checking to see if all hosts have failed 42613 1727204583.86473: getting the remaining hosts for this loop 42613 1727204583.86476: done getting the remaining hosts for this loop 42613 1727204583.86481: getting the next task for host managed-node3 42613 1727204583.86490: done getting next task for host managed-node3 42613 1727204583.86494: ^ task is: TASK: Delete dummy interface {{ interface }} 42613 1727204583.86497: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204583.86502: getting variables 42613 1727204583.86504: in VariableManager get_vars() 42613 1727204583.86551: Calling all_inventory to load vars for managed-node3 42613 1727204583.86555: Calling groups_inventory to load vars for managed-node3 42613 1727204583.86557: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204583.86976: Calling all_plugins_play to load vars for managed-node3 42613 1727204583.86980: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204583.86985: Calling groups_plugins_play to load vars for managed-node3 42613 1727204583.87294: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204583.87618: done with get_vars() 42613 1727204583.87633: done getting variables 42613 1727204583.87744: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 42613 1727204583.87887: variable 'interface' from source: set_fact TASK [Delete dummy interface ethtest0] ***************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Tuesday 24 September 2024 15:03:03 -0400 (0:00:00.046) 0:00:12.487 ***** 42613 1727204583.87912: entering _queue_task() for managed-node3/command 42613 1727204583.88172: worker is 1 (out of 1 available) 42613 1727204583.88188: exiting _queue_task() for managed-node3/command 42613 1727204583.88201: done queuing things up, now waiting for results queue to drain 42613 1727204583.88202: waiting for pending results... 
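When a `when:` condition evaluates False, the worker short-circuits `_execute()` and sends back a fixed-shape skip result instead of running the module. The field names below are taken from the `skipping: [managed-node3]` records in this log; treating it as a plain dict is an illustration, not Ansible's internal result type:

```python
# Skip result shape as rendered in the log; `false_condition` echoes the
# failed `when:` expression so callbacks can explain why the task skipped.
skip_result = {
    "changed": False,
    "false_condition": ("type == 'dummy' and state == 'absent' "
                        "and interface in current_interfaces"),
    "skip_reason": "Conditional result was False",
}
print(sorted(skip_result))
# ['changed', 'false_condition', 'skip_reason']
```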
42613 1727204583.88393: running TaskExecutor() for managed-node3/TASK: Delete dummy interface ethtest0 42613 1727204583.88476: in run() - task 127b8e07-fff9-2f91-05d8-0000000001d4 42613 1727204583.88488: variable 'ansible_search_path' from source: unknown 42613 1727204583.88493: variable 'ansible_search_path' from source: unknown 42613 1727204583.88527: calling self._execute() 42613 1727204583.88772: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204583.88776: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204583.88779: variable 'omit' from source: magic vars 42613 1727204583.89222: variable 'ansible_distribution_major_version' from source: facts 42613 1727204583.89249: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204583.89560: variable 'type' from source: set_fact 42613 1727204583.89573: variable 'state' from source: include params 42613 1727204583.89581: variable 'interface' from source: set_fact 42613 1727204583.89589: variable 'current_interfaces' from source: set_fact 42613 1727204583.89609: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 42613 1727204583.89635: when evaluation is False, skipping this task 42613 1727204583.89674: _execute() done 42613 1727204583.89688: dumping result to json 42613 1727204583.89702: done dumping result, returning 42613 1727204583.89715: done running TaskExecutor() for managed-node3/TASK: Delete dummy interface ethtest0 [127b8e07-fff9-2f91-05d8-0000000001d4] 42613 1727204583.89758: sending task result for task 127b8e07-fff9-2f91-05d8-0000000001d4 skipping: [managed-node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 42613 1727204583.90222: no more pending results, returning what we have 42613 1727204583.90226: results queue empty 42613 
1727204583.90227: checking for any_errors_fatal 42613 1727204583.90234: done checking for any_errors_fatal 42613 1727204583.90235: checking for max_fail_percentage 42613 1727204583.90237: done checking for max_fail_percentage 42613 1727204583.90238: checking to see if all hosts have failed and the running result is not ok 42613 1727204583.90239: done checking to see if all hosts have failed 42613 1727204583.90240: getting the remaining hosts for this loop 42613 1727204583.90242: done getting the remaining hosts for this loop 42613 1727204583.90247: getting the next task for host managed-node3 42613 1727204583.90255: done getting next task for host managed-node3 42613 1727204583.90259: ^ task is: TASK: Create tap interface {{ interface }} 42613 1727204583.90263: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204583.90275: getting variables 42613 1727204583.90277: in VariableManager get_vars() 42613 1727204583.90321: Calling all_inventory to load vars for managed-node3 42613 1727204583.90325: Calling groups_inventory to load vars for managed-node3 42613 1727204583.90327: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204583.90349: Calling all_plugins_play to load vars for managed-node3 42613 1727204583.90353: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204583.90357: Calling groups_plugins_play to load vars for managed-node3 42613 1727204583.91271: done sending task result for task 127b8e07-fff9-2f91-05d8-0000000001d4 42613 1727204583.91277: WORKER PROCESS EXITING 42613 1727204583.91381: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204583.91783: done with get_vars() 42613 1727204583.91846: done getting variables 42613 1727204583.91942: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 42613 1727204583.92094: variable 'interface' from source: set_fact TASK [Create tap interface ethtest0] ******************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Tuesday 24 September 2024 15:03:03 -0400 (0:00:00.042) 0:00:12.529 ***** 42613 1727204583.92139: entering _queue_task() for managed-node3/command 42613 1727204583.92614: worker is 1 (out of 1 available) 42613 1727204583.92631: exiting _queue_task() for managed-node3/command 42613 1727204583.92644: done queuing things up, now waiting for results queue to drain 42613 1727204583.92646: waiting for pending results... 
42613 1727204583.92941: running TaskExecutor() for managed-node3/TASK: Create tap interface ethtest0 42613 1727204583.93083: in run() - task 127b8e07-fff9-2f91-05d8-0000000001d5 42613 1727204583.93115: variable 'ansible_search_path' from source: unknown 42613 1727204583.93129: variable 'ansible_search_path' from source: unknown 42613 1727204583.93179: calling self._execute() 42613 1727204583.93299: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204583.93316: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204583.93341: variable 'omit' from source: magic vars 42613 1727204583.93968: variable 'ansible_distribution_major_version' from source: facts 42613 1727204583.94092: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204583.94276: variable 'type' from source: set_fact 42613 1727204583.94289: variable 'state' from source: include params 42613 1727204583.94300: variable 'interface' from source: set_fact 42613 1727204583.94321: variable 'current_interfaces' from source: set_fact 42613 1727204583.94343: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 42613 1727204583.94352: when evaluation is False, skipping this task 42613 1727204583.94360: _execute() done 42613 1727204583.94372: dumping result to json 42613 1727204583.94382: done dumping result, returning 42613 1727204583.94396: done running TaskExecutor() for managed-node3/TASK: Create tap interface ethtest0 [127b8e07-fff9-2f91-05d8-0000000001d5] 42613 1727204583.94408: sending task result for task 127b8e07-fff9-2f91-05d8-0000000001d5 skipping: [managed-node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 42613 1727204583.94722: no more pending results, returning what we have 42613 1727204583.94727: results queue empty 42613 
1727204583.94728: checking for any_errors_fatal 42613 1727204583.94736: done checking for any_errors_fatal 42613 1727204583.94737: checking for max_fail_percentage 42613 1727204583.94740: done checking for max_fail_percentage 42613 1727204583.94741: checking to see if all hosts have failed and the running result is not ok 42613 1727204583.94742: done checking to see if all hosts have failed 42613 1727204583.94743: getting the remaining hosts for this loop 42613 1727204583.94745: done getting the remaining hosts for this loop 42613 1727204583.94751: getting the next task for host managed-node3 42613 1727204583.94759: done getting next task for host managed-node3 42613 1727204583.94773: ^ task is: TASK: Delete tap interface {{ interface }} 42613 1727204583.94778: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204583.94782: getting variables 42613 1727204583.94784: in VariableManager get_vars() 42613 1727204583.94826: Calling all_inventory to load vars for managed-node3 42613 1727204583.94829: Calling groups_inventory to load vars for managed-node3 42613 1727204583.94831: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204583.94996: Calling all_plugins_play to load vars for managed-node3 42613 1727204583.95000: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204583.95004: Calling groups_plugins_play to load vars for managed-node3 42613 1727204583.95619: done sending task result for task 127b8e07-fff9-2f91-05d8-0000000001d5 42613 1727204583.95624: WORKER PROCESS EXITING 42613 1727204583.95649: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204583.96022: done with get_vars() 42613 1727204583.96039: done getting variables 42613 1727204583.96213: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 42613 1727204583.96327: variable 'interface' from source: set_fact TASK [Delete tap interface ethtest0] ******************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Tuesday 24 September 2024 15:03:03 -0400 (0:00:00.042) 0:00:12.571 ***** 42613 1727204583.96353: entering _queue_task() for managed-node3/command 42613 1727204583.96626: worker is 1 (out of 1 available) 42613 1727204583.96642: exiting _queue_task() for managed-node3/command 42613 1727204583.96656: done queuing things up, now waiting for results queue to drain 42613 1727204583.96658: waiting for pending results... 
42613 1727204583.96891: running TaskExecutor() for managed-node3/TASK: Delete tap interface ethtest0 42613 1727204583.97018: in run() - task 127b8e07-fff9-2f91-05d8-0000000001d6 42613 1727204583.97049: variable 'ansible_search_path' from source: unknown 42613 1727204583.97057: variable 'ansible_search_path' from source: unknown 42613 1727204583.97108: calling self._execute() 42613 1727204583.97225: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204583.97237: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204583.97361: variable 'omit' from source: magic vars 42613 1727204583.97728: variable 'ansible_distribution_major_version' from source: facts 42613 1727204583.97753: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204583.98070: variable 'type' from source: set_fact 42613 1727204583.98084: variable 'state' from source: include params 42613 1727204583.98093: variable 'interface' from source: set_fact 42613 1727204583.98102: variable 'current_interfaces' from source: set_fact 42613 1727204583.98134: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 42613 1727204583.98143: when evaluation is False, skipping this task 42613 1727204583.98150: _execute() done 42613 1727204583.98157: dumping result to json 42613 1727204583.98167: done dumping result, returning 42613 1727204583.98192: done running TaskExecutor() for managed-node3/TASK: Delete tap interface ethtest0 [127b8e07-fff9-2f91-05d8-0000000001d6] 42613 1727204583.98209: sending task result for task 127b8e07-fff9-2f91-05d8-0000000001d6 42613 1727204583.98421: done sending task result for task 127b8e07-fff9-2f91-05d8-0000000001d6 42613 1727204583.98424: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 
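The three skipped tasks traced above come from manage_test_interface.yml (lines 54, 60 and 65, per the task paths in the log). A minimal sketch of that pattern, reconstructed from the `false_condition` strings the log prints — the `when:` expressions are verbatim from the log, but the `ip` command lines are illustrative assumptions, since every task is skipped before anything executes:

```yaml
# Hypothetical reconstruction of the manage_test_interface.yml tasks at :54, :60, :65.
# The when: expressions are copied from the "false_condition" fields above;
# the command lines are guesses for illustration, not confirmed by this log.
- name: Delete dummy interface {{ interface }}
  command: ip link delete {{ interface }} type dummy
  when: type == 'dummy' and state == 'absent' and interface in current_interfaces

- name: Create tap interface {{ interface }}
  command: ip tuntap add dev {{ interface }} mode tap
  when: type == 'tap' and state == 'present' and interface not in current_interfaces

- name: Delete tap interface {{ interface }}
  command: ip tuntap del dev {{ interface }} mode tap
  when: type == 'tap' and state == 'absent' and interface in current_interfaces
```

In this run `interface` is `ethtest0` (a set_fact), so each conditional evaluates False and the executor emits the `skipping:` results shown, without ever queuing a command on the remote host.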
42613 1727204583.98499: no more pending results, returning what we have 42613 1727204583.98503: results queue empty 42613 1727204583.98504: checking for any_errors_fatal 42613 1727204583.98512: done checking for any_errors_fatal 42613 1727204583.98513: checking for max_fail_percentage 42613 1727204583.98516: done checking for max_fail_percentage 42613 1727204583.98517: checking to see if all hosts have failed and the running result is not ok 42613 1727204583.98518: done checking to see if all hosts have failed 42613 1727204583.98519: getting the remaining hosts for this loop 42613 1727204583.98522: done getting the remaining hosts for this loop 42613 1727204583.98526: getting the next task for host managed-node3 42613 1727204583.98538: done getting next task for host managed-node3 42613 1727204583.98542: ^ task is: TASK: Include the task 'assert_device_present.yml' 42613 1727204583.98545: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204583.98550: getting variables 42613 1727204583.98552: in VariableManager get_vars() 42613 1727204583.98812: Calling all_inventory to load vars for managed-node3 42613 1727204583.98817: Calling groups_inventory to load vars for managed-node3 42613 1727204583.98819: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204583.98832: Calling all_plugins_play to load vars for managed-node3 42613 1727204583.98834: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204583.98838: Calling groups_plugins_play to load vars for managed-node3 42613 1727204583.99149: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204583.99641: done with get_vars() 42613 1727204583.99656: done getting variables TASK [Include the task 'assert_device_present.yml'] **************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:20 Tuesday 24 September 2024 15:03:03 -0400 (0:00:00.034) 0:00:12.606 ***** 42613 1727204583.99807: entering _queue_task() for managed-node3/include_tasks 42613 1727204584.00330: worker is 1 (out of 1 available) 42613 1727204584.00343: exiting _queue_task() for managed-node3/include_tasks 42613 1727204584.00355: done queuing things up, now waiting for results queue to drain 42613 1727204584.00356: waiting for pending results... 
42613 1727204584.00803: running TaskExecutor() for managed-node3/TASK: Include the task 'assert_device_present.yml' 42613 1727204584.00818: in run() - task 127b8e07-fff9-2f91-05d8-00000000000e 42613 1727204584.00850: variable 'ansible_search_path' from source: unknown 42613 1727204584.00980: calling self._execute() 42613 1727204584.01078: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204584.01092: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204584.01110: variable 'omit' from source: magic vars 42613 1727204584.02254: variable 'ansible_distribution_major_version' from source: facts 42613 1727204584.02269: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204584.02283: _execute() done 42613 1727204584.02287: dumping result to json 42613 1727204584.02289: done dumping result, returning 42613 1727204584.02315: done running TaskExecutor() for managed-node3/TASK: Include the task 'assert_device_present.yml' [127b8e07-fff9-2f91-05d8-00000000000e] 42613 1727204584.02319: sending task result for task 127b8e07-fff9-2f91-05d8-00000000000e 42613 1727204584.02606: no more pending results, returning what we have 42613 1727204584.02612: in VariableManager get_vars() 42613 1727204584.02664: Calling all_inventory to load vars for managed-node3 42613 1727204584.02669: Calling groups_inventory to load vars for managed-node3 42613 1727204584.02671: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204584.02686: Calling all_plugins_play to load vars for managed-node3 42613 1727204584.02688: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204584.02691: Calling groups_plugins_play to load vars for managed-node3 42613 1727204584.02986: done sending task result for task 127b8e07-fff9-2f91-05d8-00000000000e 42613 1727204584.02990: WORKER PROCESS EXITING 42613 1727204584.03020: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204584.03291: done with get_vars() 42613 1727204584.03303: variable 'ansible_search_path' from source: unknown 42613 1727204584.03318: we have included files to process 42613 1727204584.03319: generating all_blocks data 42613 1727204584.03321: done generating all_blocks data 42613 1727204584.03325: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 42613 1727204584.03327: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 42613 1727204584.03329: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 42613 1727204584.03464: in VariableManager get_vars() 42613 1727204584.03482: done with get_vars() 42613 1727204584.03572: done processing included file 42613 1727204584.03574: iterating over new_blocks loaded from include file 42613 1727204584.03575: in VariableManager get_vars() 42613 1727204584.03586: done with get_vars() 42613 1727204584.03587: filtering new block on tags 42613 1727204584.03601: done filtering new block on tags 42613 1727204584.03602: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed-node3 42613 1727204584.03606: extending task lists for all hosts with included blocks 42613 1727204584.04720: done extending task lists 42613 1727204584.04722: done processing included files 42613 1727204584.04722: results queue empty 42613 1727204584.04723: checking for any_errors_fatal 42613 1727204584.04725: done checking for any_errors_fatal 42613 1727204584.04726: checking for max_fail_percentage 42613 1727204584.04727: done 
checking for max_fail_percentage 42613 1727204584.04727: checking to see if all hosts have failed and the running result is not ok 42613 1727204584.04728: done checking to see if all hosts have failed 42613 1727204584.04728: getting the remaining hosts for this loop 42613 1727204584.04729: done getting the remaining hosts for this loop 42613 1727204584.04731: getting the next task for host managed-node3 42613 1727204584.04735: done getting next task for host managed-node3 42613 1727204584.04736: ^ task is: TASK: Include the task 'get_interface_stat.yml' 42613 1727204584.04738: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204584.04740: getting variables 42613 1727204584.04741: in VariableManager get_vars() 42613 1727204584.04753: Calling all_inventory to load vars for managed-node3 42613 1727204584.04755: Calling groups_inventory to load vars for managed-node3 42613 1727204584.04756: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204584.04762: Calling all_plugins_play to load vars for managed-node3 42613 1727204584.04764: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204584.04767: Calling groups_plugins_play to load vars for managed-node3 42613 1727204584.04879: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204584.05142: done with get_vars() 42613 1727204584.05281: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 15:03:04 -0400 (0:00:00.056) 0:00:12.662 ***** 42613 1727204584.05485: entering _queue_task() for managed-node3/include_tasks 42613 1727204584.06056: worker is 1 (out of 1 available) 42613 1727204584.06073: exiting _queue_task() for managed-node3/include_tasks 42613 1727204584.06087: done queuing things up, now waiting for results queue to drain 42613 1727204584.06089: waiting for pending results... 
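The include chain above shows assert_device_present.yml starting (at its line 3, per the task path) by pulling in get_interface_stat.yml. A hedged sketch of that file's shape — the `assert` task that presumably follows the include is an assumption, as this log excerpt is cut off before any such task runs, and the `interface_stat` name is a guess:

```yaml
# Sketch of assert_device_present.yml, assuming the usual include-then-assert shape.
- name: Include the task 'get_interface_stat.yml'
  include_tasks: tasks/get_interface_stat.yml

# Hypothetical follow-up; not shown in this log excerpt.
- name: Assert that the interface is present
  assert:
    that: interface_stat.stat.exists   # interface_stat is an assumed register name
```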
42613 1727204584.06327: running TaskExecutor() for managed-node3/TASK: Include the task 'get_interface_stat.yml' 42613 1727204584.06410: in run() - task 127b8e07-fff9-2f91-05d8-0000000002ec 42613 1727204584.06571: variable 'ansible_search_path' from source: unknown 42613 1727204584.06574: variable 'ansible_search_path' from source: unknown 42613 1727204584.06577: calling self._execute() 42613 1727204584.06580: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204584.06593: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204584.06605: variable 'omit' from source: magic vars 42613 1727204584.07004: variable 'ansible_distribution_major_version' from source: facts 42613 1727204584.07026: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204584.07039: _execute() done 42613 1727204584.07053: dumping result to json 42613 1727204584.07056: done dumping result, returning 42613 1727204584.07062: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_interface_stat.yml' [127b8e07-fff9-2f91-05d8-0000000002ec] 42613 1727204584.07068: sending task result for task 127b8e07-fff9-2f91-05d8-0000000002ec 42613 1727204584.07201: no more pending results, returning what we have 42613 1727204584.07207: in VariableManager get_vars() 42613 1727204584.07257: Calling all_inventory to load vars for managed-node3 42613 1727204584.07261: Calling groups_inventory to load vars for managed-node3 42613 1727204584.07262: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204584.07270: done sending task result for task 127b8e07-fff9-2f91-05d8-0000000002ec 42613 1727204584.07273: WORKER PROCESS EXITING 42613 1727204584.07288: Calling all_plugins_play to load vars for managed-node3 42613 1727204584.07291: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204584.07294: Calling groups_plugins_play to load vars for managed-node3 42613 
1727204584.07474: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204584.07614: done with get_vars() 42613 1727204584.07621: variable 'ansible_search_path' from source: unknown 42613 1727204584.07622: variable 'ansible_search_path' from source: unknown 42613 1727204584.07652: we have included files to process 42613 1727204584.07652: generating all_blocks data 42613 1727204584.07654: done generating all_blocks data 42613 1727204584.07654: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 42613 1727204584.07655: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 42613 1727204584.07657: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 42613 1727204584.07839: done processing included file 42613 1727204584.07841: iterating over new_blocks loaded from include file 42613 1727204584.07842: in VariableManager get_vars() 42613 1727204584.07854: done with get_vars() 42613 1727204584.07855: filtering new block on tags 42613 1727204584.07868: done filtering new block on tags 42613 1727204584.07870: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node3 42613 1727204584.07874: extending task lists for all hosts with included blocks 42613 1727204584.07944: done extending task lists 42613 1727204584.07945: done processing included files 42613 1727204584.07946: results queue empty 42613 1727204584.07946: checking for any_errors_fatal 42613 1727204584.07949: done checking for any_errors_fatal 42613 1727204584.07950: checking for max_fail_percentage 42613 1727204584.07950: done checking for 
max_fail_percentage 42613 1727204584.07951: checking to see if all hosts have failed and the running result is not ok 42613 1727204584.07952: done checking to see if all hosts have failed 42613 1727204584.07952: getting the remaining hosts for this loop 42613 1727204584.07953: done getting the remaining hosts for this loop 42613 1727204584.07955: getting the next task for host managed-node3 42613 1727204584.07958: done getting next task for host managed-node3 42613 1727204584.07959: ^ task is: TASK: Get stat for interface {{ interface }} 42613 1727204584.07961: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204584.07963: getting variables 42613 1727204584.07964: in VariableManager get_vars() 42613 1727204584.07974: Calling all_inventory to load vars for managed-node3 42613 1727204584.07976: Calling groups_inventory to load vars for managed-node3 42613 1727204584.07977: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204584.07981: Calling all_plugins_play to load vars for managed-node3 42613 1727204584.07983: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204584.07985: Calling groups_plugins_play to load vars for managed-node3 42613 1727204584.08115: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204584.08251: done with get_vars() 42613 1727204584.08259: done getting variables 42613 1727204584.08388: variable 'interface' from source: set_fact TASK [Get stat for interface ethtest0] ***************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 15:03:04 -0400 (0:00:00.029) 0:00:12.692 ***** 42613 1727204584.08412: entering _queue_task() for managed-node3/stat 42613 1727204584.08680: worker is 1 (out of 1 available) 42613 1727204584.08694: exiting _queue_task() for managed-node3/stat 42613 1727204584.08706: done queuing things up, now waiting for results queue to drain 42613 1727204584.08708: waiting for pending results... 
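The "Get stat for interface ethtest0" task enters `_queue_task()` for `managed-node3/stat`, so get_interface_stat.yml evidently wraps a single `stat` call on the interface. A minimal sketch; the `/sys/class/net` path and the `interface_stat` register name are assumptions for illustration, not confirmed by this log:

```yaml
# Hypothetical get_interface_stat.yml (task at :3 per the log's task path).
- name: Get stat for interface {{ interface }}
  stat:
    path: /sys/class/net/{{ interface }}   # assumed device path
  register: interface_stat                 # assumed register name
```

The subsequent `_low_level_execute_command()` lines below are the standard remote bootstrap for any module: `echo ~` to resolve the remote home directory, then a `mkdir -p` of a per-task `~/.ansible/tmp/ansible-tmp-…` directory into which the stat module payload is copied.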
42613 1727204584.08894: running TaskExecutor() for managed-node3/TASK: Get stat for interface ethtest0 42613 1727204584.08979: in run() - task 127b8e07-fff9-2f91-05d8-0000000003b5 42613 1727204584.08991: variable 'ansible_search_path' from source: unknown 42613 1727204584.08994: variable 'ansible_search_path' from source: unknown 42613 1727204584.09032: calling self._execute() 42613 1727204584.09104: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204584.09108: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204584.09120: variable 'omit' from source: magic vars 42613 1727204584.09421: variable 'ansible_distribution_major_version' from source: facts 42613 1727204584.09430: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204584.09440: variable 'omit' from source: magic vars 42613 1727204584.09480: variable 'omit' from source: magic vars 42613 1727204584.09556: variable 'interface' from source: set_fact 42613 1727204584.09579: variable 'omit' from source: magic vars 42613 1727204584.09618: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 42613 1727204584.09649: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 42613 1727204584.09670: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 42613 1727204584.09686: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204584.09698: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204584.09724: variable 'inventory_hostname' from source: host vars for 'managed-node3' 42613 1727204584.09727: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204584.09730: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204584.09811: Set connection var ansible_shell_executable to /bin/sh 42613 1727204584.09814: Set connection var ansible_pipelining to False 42613 1727204584.09823: Set connection var ansible_module_compression to ZIP_DEFLATED 42613 1727204584.09825: Set connection var ansible_connection to ssh 42613 1727204584.09830: Set connection var ansible_timeout to 10 42613 1727204584.09835: Set connection var ansible_shell_type to sh 42613 1727204584.09853: variable 'ansible_shell_executable' from source: unknown 42613 1727204584.09856: variable 'ansible_connection' from source: unknown 42613 1727204584.09859: variable 'ansible_module_compression' from source: unknown 42613 1727204584.09862: variable 'ansible_shell_type' from source: unknown 42613 1727204584.09866: variable 'ansible_shell_executable' from source: unknown 42613 1727204584.09870: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204584.09873: variable 'ansible_pipelining' from source: unknown 42613 1727204584.09875: variable 'ansible_timeout' from source: unknown 42613 1727204584.09879: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204584.10046: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 42613 1727204584.10055: variable 'omit' from source: magic vars 42613 1727204584.10060: starting attempt loop 42613 1727204584.10063: running the handler 42613 1727204584.10077: _low_level_execute_command(): starting 42613 1727204584.10083: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 42613 1727204584.10675: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204584.10679: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 42613 1727204584.10684: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204584.10735: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204584.10739: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204584.10741: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204584.10827: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204584.12719: stdout chunk (state=3): >>>/root <<< 42613 1727204584.12835: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204584.12904: stderr chunk (state=3): >>><<< 42613 1727204584.12908: stdout chunk (state=3): >>><<< 42613 1727204584.12929: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 
originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204584.12944: _low_level_execute_command(): starting 42613 1727204584.12951: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204584.1292977-43746-58504445529515 `" && echo ansible-tmp-1727204584.1292977-43746-58504445529515="` echo /root/.ansible/tmp/ansible-tmp-1727204584.1292977-43746-58504445529515 `" ) && sleep 0' 42613 1727204584.13573: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204584.13577: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 42613 1727204584.13588: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.169 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 42613 1727204584.13593: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204584.13617: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204584.13638: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204584.13744: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204584.15846: stdout chunk (state=3): >>>ansible-tmp-1727204584.1292977-43746-58504445529515=/root/.ansible/tmp/ansible-tmp-1727204584.1292977-43746-58504445529515 <<< 42613 1727204584.15997: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204584.16029: stderr chunk (state=3): >>><<< 42613 1727204584.16036: stdout chunk (state=3): >>><<< 42613 1727204584.16051: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204584.1292977-43746-58504445529515=/root/.ansible/tmp/ansible-tmp-1727204584.1292977-43746-58504445529515 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204584.16097: variable 'ansible_module_compression' from source: unknown 42613 1727204584.16146: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-42613igbktgy6/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 42613 1727204584.16179: variable 'ansible_facts' from source: unknown 42613 1727204584.16246: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204584.1292977-43746-58504445529515/AnsiballZ_stat.py 42613 1727204584.16362: Sending initial data 42613 1727204584.16369: Sent initial data (152 bytes) 42613 1727204584.16858: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204584.16862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204584.16867: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 
1727204584.16869: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204584.16924: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204584.16928: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204584.16930: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204584.17011: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204584.18804: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 42613 1727204584.18855: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 42613 1727204584.18945: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-42613igbktgy6/tmps6rllcpb /root/.ansible/tmp/ansible-tmp-1727204584.1292977-43746-58504445529515/AnsiballZ_stat.py <<< 42613 1727204584.18949: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204584.1292977-43746-58504445529515/AnsiballZ_stat.py" <<< 42613 1727204584.19002: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-42613igbktgy6/tmps6rllcpb" to remote "/root/.ansible/tmp/ansible-tmp-1727204584.1292977-43746-58504445529515/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204584.1292977-43746-58504445529515/AnsiballZ_stat.py" <<< 42613 1727204584.19900: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204584.19997: stderr chunk (state=3): >>><<< 42613 1727204584.20012: stdout chunk (state=3): >>><<< 42613 1727204584.20152: done transferring module to remote 42613 1727204584.20155: _low_level_execute_command(): starting 42613 1727204584.20158: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204584.1292977-43746-58504445529515/ /root/.ansible/tmp/ansible-tmp-1727204584.1292977-43746-58504445529515/AnsiballZ_stat.py && sleep 0' 42613 1727204584.20826: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204584.20846: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204584.20942: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204584.20989: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204584.21015: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204584.21074: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204584.21154: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204584.23236: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204584.23261: stdout chunk (state=3): >>><<< 42613 1727204584.23264: stderr chunk (state=3): >>><<< 42613 1727204584.23367: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 
originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204584.23371: _low_level_execute_command(): starting 42613 1727204584.23374: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204584.1292977-43746-58504445529515/AnsiballZ_stat.py && sleep 0' 42613 1727204584.24030: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204584.24035: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204584.24112: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204584.41712: stdout chunk (state=3): >>> 
{"changed": false, "stat": {"exists": true, "path": "/sys/class/net/ethtest0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 41392, "dev": 23, "nlink": 1, "atime": 1727204582.3072946, "mtime": 1727204582.3072946, "ctime": 1727204582.3072946, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/ethtest0", "lnk_target": "../../devices/virtual/net/ethtest0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 42613 1727204584.43277: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
<<< 42613 1727204584.43345: stderr chunk (state=3): >>><<< 42613 1727204584.43349: stdout chunk (state=3): >>><<< 42613 1727204584.43474: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/ethtest0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 41392, "dev": 23, "nlink": 1, "atime": 1727204582.3072946, "mtime": 1727204582.3072946, "ctime": 1727204582.3072946, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/ethtest0", "lnk_target": "../../devices/virtual/net/ethtest0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 42613 1727204584.43479: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204584.1292977-43746-58504445529515/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 42613 1727204584.43481: _low_level_execute_command(): starting 42613 1727204584.43484: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204584.1292977-43746-58504445529515/ > /dev/null 2>&1 && sleep 0' 42613 1727204584.44085: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204584.44107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204584.44161: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204584.44181: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204584.44255: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204584.46321: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204584.46402: stderr chunk (state=3): >>><<< 42613 1727204584.46406: stdout chunk (state=3): >>><<< 42613 1727204584.46571: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204584.46575: handler run complete 42613 1727204584.46577: attempt loop complete, returning result 42613 1727204584.46579: _execute() done 42613 1727204584.46581: dumping result to json 42613 1727204584.46583: done dumping result, returning 42613 1727204584.46585: done running TaskExecutor() for managed-node3/TASK: Get stat for interface ethtest0 [127b8e07-fff9-2f91-05d8-0000000003b5] 42613 1727204584.46587: sending task result for task 127b8e07-fff9-2f91-05d8-0000000003b5 42613 1727204584.46669: done sending task result for task 127b8e07-fff9-2f91-05d8-0000000003b5 42613 1727204584.46672: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "stat": { "atime": 1727204582.3072946, "block_size": 4096, "blocks": 0, "ctime": 1727204582.3072946, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 41392, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/ethtest0", "lnk_target": "../../devices/virtual/net/ethtest0", "mode": "0777", "mtime": 1727204582.3072946, "nlink": 1, "path": "/sys/class/net/ethtest0", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 42613 1727204584.46779: no more pending results, returning what we have 42613 1727204584.46784: results queue empty 42613 1727204584.46785: checking for any_errors_fatal 42613 1727204584.46786: done checking for any_errors_fatal 42613 1727204584.46787: 
checking for max_fail_percentage 42613 1727204584.46790: done checking for max_fail_percentage 42613 1727204584.46790: checking to see if all hosts have failed and the running result is not ok 42613 1727204584.46791: done checking to see if all hosts have failed 42613 1727204584.46792: getting the remaining hosts for this loop 42613 1727204584.46794: done getting the remaining hosts for this loop 42613 1727204584.46798: getting the next task for host managed-node3 42613 1727204584.46807: done getting next task for host managed-node3 42613 1727204584.46811: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 42613 1727204584.46814: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204584.46819: getting variables 42613 1727204584.46821: in VariableManager get_vars() 42613 1727204584.46861: Calling all_inventory to load vars for managed-node3 42613 1727204584.46863: Calling groups_inventory to load vars for managed-node3 42613 1727204584.46958: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204584.46976: Calling all_plugins_play to load vars for managed-node3 42613 1727204584.46979: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204584.46983: Calling groups_plugins_play to load vars for managed-node3 42613 1727204584.47185: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204584.47364: done with get_vars() 42613 1727204584.47375: done getting variables 42613 1727204584.47456: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 42613 1727204584.47553: variable 'interface' from source: set_fact TASK [Assert that the interface is present - 'ethtest0'] *********************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Tuesday 24 September 2024 15:03:04 -0400 (0:00:00.391) 0:00:13.083 ***** 42613 1727204584.47580: entering _queue_task() for managed-node3/assert 42613 1727204584.47582: Creating lock for assert 42613 1727204584.47839: worker is 1 (out of 1 available) 42613 1727204584.47853: exiting _queue_task() for managed-node3/assert 42613 1727204584.47868: done queuing things up, now waiting for results queue to drain 42613 1727204584.47870: waiting for pending results... 
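The stat result above shows how the play detects interface presence: `/sys/class/net/ethtest0` exists and is a symlink (`islnk: true`) pointing into `/sys/devices/virtual/net/ethtest0`, which is exactly what the kernel creates for a virtual test interface. A minimal sketch of that presence check outside Ansible, using a scratch directory to stand in for `/sys/class/net` (the directory layout below is illustrative, not taken from the log):

```python
import os
import tempfile

def interface_present(class_net_dir, name):
    """Mimic the stat-module check the play runs: an interface is present
    when <class_net_dir>/<name> exists; on a real system it is a symlink
    into /sys/devices/..., so we lstat (follow=False, as in the log)."""
    path = os.path.join(class_net_dir, name)
    try:
        os.lstat(path)  # raises FileNotFoundError if the interface is absent
    except FileNotFoundError:
        return {"exists": False}
    islnk = os.path.islink(path)
    return {
        "exists": True,
        "islnk": islnk,
        "lnk_target": os.readlink(path) if islnk else None,
    }

# Demonstration against a temporary tree shaped like sysfs.
with tempfile.TemporaryDirectory() as root:
    os.makedirs(os.path.join(root, "devices", "virtual", "net", "ethtest0"))
    class_net = os.path.join(root, "class", "net")
    os.makedirs(class_net)
    os.symlink("../../devices/virtual/net/ethtest0",
               os.path.join(class_net, "ethtest0"))
    print(interface_present(class_net, "ethtest0"))  # exists, is a symlink
    print(interface_present(class_net, "missing0"))  # {'exists': False}
```

The assert task that follows in the log only needs `interface_stat.stat.exists` to be true, which corresponds to the `"exists"` key here.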
42613 1727204584.48063: running TaskExecutor() for managed-node3/TASK: Assert that the interface is present - 'ethtest0' 42613 1727204584.48135: in run() - task 127b8e07-fff9-2f91-05d8-0000000002ed 42613 1727204584.48149: variable 'ansible_search_path' from source: unknown 42613 1727204584.48152: variable 'ansible_search_path' from source: unknown 42613 1727204584.48185: calling self._execute() 42613 1727204584.48258: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204584.48263: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204584.48274: variable 'omit' from source: magic vars 42613 1727204584.48573: variable 'ansible_distribution_major_version' from source: facts 42613 1727204584.48583: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204584.48589: variable 'omit' from source: magic vars 42613 1727204584.48618: variable 'omit' from source: magic vars 42613 1727204584.48697: variable 'interface' from source: set_fact 42613 1727204584.48713: variable 'omit' from source: magic vars 42613 1727204584.48752: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 42613 1727204584.48786: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 42613 1727204584.48803: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 42613 1727204584.48818: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204584.48828: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204584.48860: variable 'inventory_hostname' from source: host vars for 'managed-node3' 42613 1727204584.48864: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204584.48868: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204584.48946: Set connection var ansible_shell_executable to /bin/sh 42613 1727204584.48950: Set connection var ansible_pipelining to False 42613 1727204584.48957: Set connection var ansible_module_compression to ZIP_DEFLATED 42613 1727204584.48962: Set connection var ansible_connection to ssh 42613 1727204584.48965: Set connection var ansible_timeout to 10 42613 1727204584.48976: Set connection var ansible_shell_type to sh 42613 1727204584.48991: variable 'ansible_shell_executable' from source: unknown 42613 1727204584.48994: variable 'ansible_connection' from source: unknown 42613 1727204584.48997: variable 'ansible_module_compression' from source: unknown 42613 1727204584.48999: variable 'ansible_shell_type' from source: unknown 42613 1727204584.49001: variable 'ansible_shell_executable' from source: unknown 42613 1727204584.49004: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204584.49008: variable 'ansible_pipelining' from source: unknown 42613 1727204584.49012: variable 'ansible_timeout' from source: unknown 42613 1727204584.49016: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204584.49133: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 42613 1727204584.49145: variable 'omit' from source: magic vars 42613 1727204584.49150: starting attempt loop 42613 1727204584.49154: running the handler 42613 1727204584.49264: variable 'interface_stat' from source: set_fact 42613 1727204584.49283: Evaluated conditional (interface_stat.stat.exists): True 42613 1727204584.49289: handler run complete 42613 1727204584.49305: attempt loop complete, returning result 42613 
1727204584.49309: _execute() done 42613 1727204584.49311: dumping result to json 42613 1727204584.49314: done dumping result, returning 42613 1727204584.49321: done running TaskExecutor() for managed-node3/TASK: Assert that the interface is present - 'ethtest0' [127b8e07-fff9-2f91-05d8-0000000002ed] 42613 1727204584.49326: sending task result for task 127b8e07-fff9-2f91-05d8-0000000002ed 42613 1727204584.49427: done sending task result for task 127b8e07-fff9-2f91-05d8-0000000002ed 42613 1727204584.49430: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false } MSG: All assertions passed 42613 1727204584.49486: no more pending results, returning what we have 42613 1727204584.49490: results queue empty 42613 1727204584.49492: checking for any_errors_fatal 42613 1727204584.49500: done checking for any_errors_fatal 42613 1727204584.49501: checking for max_fail_percentage 42613 1727204584.49503: done checking for max_fail_percentage 42613 1727204584.49504: checking to see if all hosts have failed and the running result is not ok 42613 1727204584.49505: done checking to see if all hosts have failed 42613 1727204584.49506: getting the remaining hosts for this loop 42613 1727204584.49507: done getting the remaining hosts for this loop 42613 1727204584.49512: getting the next task for host managed-node3 42613 1727204584.49520: done getting next task for host managed-node3 42613 1727204584.49523: ^ task is: TASK: Create a dedicated test file in `/etc/iproute2/rt_tables.d/` and add a new routing table 42613 1727204584.49525: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204584.49529: getting variables 42613 1727204584.49530: in VariableManager get_vars() 42613 1727204584.49577: Calling all_inventory to load vars for managed-node3 42613 1727204584.49580: Calling groups_inventory to load vars for managed-node3 42613 1727204584.49582: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204584.49592: Calling all_plugins_play to load vars for managed-node3 42613 1727204584.49595: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204584.49598: Calling groups_plugins_play to load vars for managed-node3 42613 1727204584.49752: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204584.49895: done with get_vars() 42613 1727204584.49906: done getting variables TASK [Create a dedicated test file in `/etc/iproute2/rt_tables.d/` and add a new routing table] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:23 Tuesday 24 September 2024 15:03:04 -0400 (0:00:00.023) 0:00:13.107 ***** 42613 1727204584.49981: entering _queue_task() for managed-node3/lineinfile 42613 1727204584.49982: Creating lock for lineinfile 42613 1727204584.50234: worker is 1 (out of 1 available) 42613 1727204584.50249: exiting _queue_task() for managed-node3/lineinfile 42613 1727204584.50261: done queuing things up, now waiting for results queue to drain 42613 1727204584.50262: waiting for pending results... 
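The assert task that just completed above (task `127b8e07-fff9-2f91-05d8-0000000002ed`, "Assert that the interface is present - 'ethtest0'") evaluated the conditional `interface_stat.stat.exists` to `True`. A minimal reconstruction of what that task likely looks like in the test playbook, inferred only from the task name and the logged conditional (the real task may phrase the assertion or its message differently):

```yaml
# Hypothetical reconstruction from the log above, not the playbook source.
# interface_stat is a set_fact/stat result registered by an earlier task.
- name: Assert that the interface is present - 'ethtest0'
  assert:
    that:
      - interface_stat.stat.exists
```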
42613 1727204584.50451: running TaskExecutor() for managed-node3/TASK: Create a dedicated test file in `/etc/iproute2/rt_tables.d/` and add a new routing table 42613 1727204584.50516: in run() - task 127b8e07-fff9-2f91-05d8-00000000000f 42613 1727204584.50529: variable 'ansible_search_path' from source: unknown 42613 1727204584.50564: calling self._execute() 42613 1727204584.50636: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204584.50643: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204584.50652: variable 'omit' from source: magic vars 42613 1727204584.50951: variable 'ansible_distribution_major_version' from source: facts 42613 1727204584.50963: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204584.50970: variable 'omit' from source: magic vars 42613 1727204584.50986: variable 'omit' from source: magic vars 42613 1727204584.51013: variable 'omit' from source: magic vars 42613 1727204584.51053: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 42613 1727204584.51084: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 42613 1727204584.51102: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 42613 1727204584.51116: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204584.51127: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204584.51156: variable 'inventory_hostname' from source: host vars for 'managed-node3' 42613 1727204584.51160: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204584.51163: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204584.51242: Set 
connection var ansible_shell_executable to /bin/sh 42613 1727204584.51248: Set connection var ansible_pipelining to False 42613 1727204584.51269: Set connection var ansible_module_compression to ZIP_DEFLATED 42613 1727204584.51273: Set connection var ansible_connection to ssh 42613 1727204584.51276: Set connection var ansible_timeout to 10 42613 1727204584.51278: Set connection var ansible_shell_type to sh 42613 1727204584.51294: variable 'ansible_shell_executable' from source: unknown 42613 1727204584.51297: variable 'ansible_connection' from source: unknown 42613 1727204584.51300: variable 'ansible_module_compression' from source: unknown 42613 1727204584.51303: variable 'ansible_shell_type' from source: unknown 42613 1727204584.51305: variable 'ansible_shell_executable' from source: unknown 42613 1727204584.51308: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204584.51310: variable 'ansible_pipelining' from source: unknown 42613 1727204584.51314: variable 'ansible_timeout' from source: unknown 42613 1727204584.51319: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204584.51485: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 42613 1727204584.51494: variable 'omit' from source: magic vars 42613 1727204584.51504: starting attempt loop 42613 1727204584.51508: running the handler 42613 1727204584.51529: _low_level_execute_command(): starting 42613 1727204584.51532: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 42613 1727204584.52130: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204584.52139: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204584.52143: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204584.52191: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204584.52198: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204584.52200: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204584.52274: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204584.54091: stdout chunk (state=3): >>>/root <<< 42613 1727204584.54275: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204584.54289: stderr chunk (state=3): >>><<< 42613 1727204584.54292: stdout chunk (state=3): >>><<< 42613 1727204584.54316: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204584.54329: _low_level_execute_command(): starting 42613 1727204584.54338: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204584.5431547-43846-151337472840316 `" && echo ansible-tmp-1727204584.5431547-43846-151337472840316="` echo /root/.ansible/tmp/ansible-tmp-1727204584.5431547-43846-151337472840316 `" ) && sleep 0' 42613 1727204584.54835: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204584.54839: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204584.54842: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204584.54852: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204584.54900: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204584.54904: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204584.54906: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204584.54990: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204584.57151: stdout chunk (state=3): >>>ansible-tmp-1727204584.5431547-43846-151337472840316=/root/.ansible/tmp/ansible-tmp-1727204584.5431547-43846-151337472840316 <<< 42613 1727204584.57264: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204584.57323: stderr chunk (state=3): >>><<< 42613 1727204584.57327: stdout chunk (state=3): >>><<< 42613 1727204584.57350: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204584.5431547-43846-151337472840316=/root/.ansible/tmp/ansible-tmp-1727204584.5431547-43846-151337472840316 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204584.57396: variable 'ansible_module_compression' from source: unknown 42613 1727204584.57436: ANSIBALLZ: Using lock for lineinfile 42613 1727204584.57439: ANSIBALLZ: Acquiring lock 42613 1727204584.57446: ANSIBALLZ: Lock acquired: 139982753230544 42613 1727204584.57448: ANSIBALLZ: Creating module 42613 1727204584.70381: ANSIBALLZ: Writing module into payload 42613 1727204584.70500: ANSIBALLZ: Writing module 42613 1727204584.70544: ANSIBALLZ: Renaming module 42613 1727204584.70564: ANSIBALLZ: Done creating module 42613 1727204584.70591: variable 'ansible_facts' from source: unknown 42613 1727204584.70661: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204584.5431547-43846-151337472840316/AnsiballZ_lineinfile.py 42613 1727204584.70918: Sending initial data 42613 1727204584.70929: Sent initial data (159 bytes) 42613 1727204584.71844: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204584.71881: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204584.71996: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204584.73861: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 42613 1727204584.73973: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 42613 1727204584.74010: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-42613igbktgy6/tmpy2if7vde /root/.ansible/tmp/ansible-tmp-1727204584.5431547-43846-151337472840316/AnsiballZ_lineinfile.py <<< 42613 1727204584.74014: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204584.5431547-43846-151337472840316/AnsiballZ_lineinfile.py" <<< 42613 1727204584.74098: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-42613igbktgy6/tmpy2if7vde" to remote "/root/.ansible/tmp/ansible-tmp-1727204584.5431547-43846-151337472840316/AnsiballZ_lineinfile.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204584.5431547-43846-151337472840316/AnsiballZ_lineinfile.py" <<< 42613 1727204584.75792: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204584.75805: stdout chunk (state=3): >>><<< 42613 1727204584.75817: stderr chunk (state=3): >>><<< 42613 1727204584.76072: done transferring module to remote 42613 1727204584.76076: _low_level_execute_command(): starting 42613 1727204584.76079: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204584.5431547-43846-151337472840316/ /root/.ansible/tmp/ansible-tmp-1727204584.5431547-43846-151337472840316/AnsiballZ_lineinfile.py && sleep 0' 42613 1727204584.77302: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204584.77421: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204584.77482: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204584.77591: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204584.79592: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204584.79681: stderr chunk (state=3): >>><<< 42613 1727204584.79691: stdout chunk (state=3): >>><<< 42613 1727204584.79716: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204584.79727: _low_level_execute_command(): starting 42613 1727204584.79741: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204584.5431547-43846-151337472840316/AnsiballZ_lineinfile.py && sleep 0' 42613 1727204584.81072: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204584.81093: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204584.81110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204584.81130: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 42613 1727204584.81152: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 42613 1727204584.81164: stderr chunk (state=3): >>>debug2: match not found <<< 42613 1727204584.81180: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204584.81198: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 42613 1727204584.81210: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.169 is address <<< 42613 1727204584.81222: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 42613 1727204584.81317: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204584.81345: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204584.81470: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204585.00323: stdout chunk (state=3): >>> {"changed": true, "msg": "line added", "backup": "", "diff": [{"before": "", "after": "", "before_header": "/etc/iproute2/rt_tables.d/table.conf (content)", "after_header": "/etc/iproute2/rt_tables.d/table.conf (content)"}, {"before_header": "/etc/iproute2/rt_tables.d/table.conf (file attributes)", "after_header": "/etc/iproute2/rt_tables.d/table.conf (file attributes)"}], "invocation": {"module_args": {"path": "/etc/iproute2/rt_tables.d/table.conf", "line": "200 custom", "mode": "0644", "create": true, "state": "present", "backrefs": false, "backup": false, "firstmatch": false, "unsafe_writes": false, "regexp": null, "search_string": null, "insertafter": null, "insertbefore": null, "validate": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 42613 1727204585.01674: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
<<< 42613 1727204585.01678: stdout chunk (state=3): >>><<< 42613 1727204585.01680: stderr chunk (state=3): >>><<< 42613 1727204585.01683: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "msg": "line added", "backup": "", "diff": [{"before": "", "after": "", "before_header": "/etc/iproute2/rt_tables.d/table.conf (content)", "after_header": "/etc/iproute2/rt_tables.d/table.conf (content)"}, {"before_header": "/etc/iproute2/rt_tables.d/table.conf (file attributes)", "after_header": "/etc/iproute2/rt_tables.d/table.conf (file attributes)"}], "invocation": {"module_args": {"path": "/etc/iproute2/rt_tables.d/table.conf", "line": "200 custom", "mode": "0644", "create": true, "state": "present", "backrefs": false, "backup": false, "firstmatch": false, "unsafe_writes": false, "regexp": null, "search_string": null, "insertafter": null, "insertbefore": null, "validate": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 42613 1727204585.01686: done with _execute_module (lineinfile, {'path': '/etc/iproute2/rt_tables.d/table.conf', 'line': '200 custom', 'mode': '0644', 'create': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'lineinfile', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204584.5431547-43846-151337472840316/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 42613 1727204585.01758: _low_level_execute_command(): starting 42613 1727204585.01961: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204584.5431547-43846-151337472840316/ > /dev/null 2>&1 && sleep 0' 42613 1727204585.03092: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204585.03326: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204585.03346: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204585.03375: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204585.03496: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204585.05601: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204585.05616: stdout chunk (state=3): >>><<< 42613 1727204585.05631: stderr chunk (state=3): >>><<< 42613 1727204585.05655: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204585.05677: handler run complete 42613 1727204585.05711: attempt loop complete, returning result 42613 1727204585.05727: _execute() done 42613 1727204585.05733: dumping result to json 42613 1727204585.05743: done dumping result, returning 42613 1727204585.05756: done running TaskExecutor() for managed-node3/TASK: Create a dedicated test file in `/etc/iproute2/rt_tables.d/` and add a new routing table [127b8e07-fff9-2f91-05d8-00000000000f] 42613 1727204585.05773: sending task result for task 127b8e07-fff9-2f91-05d8-00000000000f changed: [managed-node3] => { "backup": "", "changed": true } MSG: line added 42613 1727204585.06021: no more pending results, returning what we have 42613 1727204585.06024: results queue empty 42613 1727204585.06025: checking for any_errors_fatal 42613 1727204585.06032: done checking for any_errors_fatal 42613 1727204585.06033: checking for max_fail_percentage 42613 1727204585.06036: done checking for max_fail_percentage 42613 1727204585.06037: checking to see if all hosts have failed and the running result is not ok 42613 1727204585.06037: done checking to see if all hosts have failed 42613 1727204585.06038: getting the remaining hosts for this loop 42613 1727204585.06040: done getting the remaining hosts for this loop 42613 1727204585.06045: getting the next task for host managed-node3 42613 1727204585.06053: done getting next task for host managed-node3 42613 1727204585.06059: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 42613 1727204585.06062: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 42613 1727204585.06187: getting variables 42613 1727204585.06189: in VariableManager get_vars() 42613 1727204585.06239: Calling all_inventory to load vars for managed-node3 42613 1727204585.06243: Calling groups_inventory to load vars for managed-node3 42613 1727204585.06245: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204585.06260: Calling all_plugins_play to load vars for managed-node3 42613 1727204585.06263: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204585.06269: Calling groups_plugins_play to load vars for managed-node3 42613 1727204585.06937: done sending task result for task 127b8e07-fff9-2f91-05d8-00000000000f 42613 1727204585.06942: WORKER PROCESS EXITING 42613 1727204585.06979: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204585.07237: done with get_vars() 42613 1727204585.07251: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 15:03:05 -0400 (0:00:00.573) 0:00:13.681 ***** 42613 1727204585.07365: entering _queue_task() for managed-node3/include_tasks 42613 1727204585.07851: worker is 1 (out of 1 available) 42613 1727204585.07862: exiting _queue_task() for managed-node3/include_tasks 42613 1727204585.07897: done queuing things up, now waiting for results queue to drain 42613 1727204585.07899: waiting for pending results... 
42613 1727204585.08114: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 42613 1727204585.08297: in run() - task 127b8e07-fff9-2f91-05d8-000000000017 42613 1727204585.08320: variable 'ansible_search_path' from source: unknown 42613 1727204585.08327: variable 'ansible_search_path' from source: unknown 42613 1727204585.08379: calling self._execute() 42613 1727204585.08489: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204585.08502: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204585.08515: variable 'omit' from source: magic vars 42613 1727204585.08971: variable 'ansible_distribution_major_version' from source: facts 42613 1727204585.08990: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204585.09000: _execute() done 42613 1727204585.09006: dumping result to json 42613 1727204585.09013: done dumping result, returning 42613 1727204585.09028: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [127b8e07-fff9-2f91-05d8-000000000017] 42613 1727204585.09040: sending task result for task 127b8e07-fff9-2f91-05d8-000000000017 42613 1727204585.09312: no more pending results, returning what we have 42613 1727204585.09317: in VariableManager get_vars() 42613 1727204585.09373: Calling all_inventory to load vars for managed-node3 42613 1727204585.09376: Calling groups_inventory to load vars for managed-node3 42613 1727204585.09379: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204585.09394: Calling all_plugins_play to load vars for managed-node3 42613 1727204585.09398: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204585.09401: Calling groups_plugins_play to load vars for managed-node3 42613 1727204585.09754: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' 
skipped due to reserved name 42613 1727204585.10147: done with get_vars() 42613 1727204585.10157: variable 'ansible_search_path' from source: unknown 42613 1727204585.10158: variable 'ansible_search_path' from source: unknown 42613 1727204585.10184: done sending task result for task 127b8e07-fff9-2f91-05d8-000000000017 42613 1727204585.10188: WORKER PROCESS EXITING 42613 1727204585.10250: we have included files to process 42613 1727204585.10251: generating all_blocks data 42613 1727204585.10253: done generating all_blocks data 42613 1727204585.10259: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 42613 1727204585.10260: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 42613 1727204585.10262: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 42613 1727204585.11152: done processing included file 42613 1727204585.11155: iterating over new_blocks loaded from include file 42613 1727204585.11156: in VariableManager get_vars() 42613 1727204585.11188: done with get_vars() 42613 1727204585.11190: filtering new block on tags 42613 1727204585.11209: done filtering new block on tags 42613 1727204585.11212: in VariableManager get_vars() 42613 1727204585.11234: done with get_vars() 42613 1727204585.11236: filtering new block on tags 42613 1727204585.11258: done filtering new block on tags 42613 1727204585.11261: in VariableManager get_vars() 42613 1727204585.11283: done with get_vars() 42613 1727204585.11289: filtering new block on tags 42613 1727204585.11309: done filtering new block on tags 42613 1727204585.11311: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node3 42613 1727204585.11317: extending task lists for all hosts 
with included blocks 42613 1727204585.13073: done extending task lists 42613 1727204585.13078: done processing included files 42613 1727204585.13079: results queue empty 42613 1727204585.13080: checking for any_errors_fatal 42613 1727204585.13086: done checking for any_errors_fatal 42613 1727204585.13086: checking for max_fail_percentage 42613 1727204585.13088: done checking for max_fail_percentage 42613 1727204585.13089: checking to see if all hosts have failed and the running result is not ok 42613 1727204585.13089: done checking to see if all hosts have failed 42613 1727204585.13090: getting the remaining hosts for this loop 42613 1727204585.13092: done getting the remaining hosts for this loop 42613 1727204585.13095: getting the next task for host managed-node3 42613 1727204585.13100: done getting next task for host managed-node3 42613 1727204585.13103: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 42613 1727204585.13107: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204585.13119: getting variables 42613 1727204585.13120: in VariableManager get_vars() 42613 1727204585.13140: Calling all_inventory to load vars for managed-node3 42613 1727204585.13143: Calling groups_inventory to load vars for managed-node3 42613 1727204585.13145: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204585.13151: Calling all_plugins_play to load vars for managed-node3 42613 1727204585.13154: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204585.13157: Calling groups_plugins_play to load vars for managed-node3 42613 1727204585.13478: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204585.13815: done with get_vars() 42613 1727204585.13831: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 15:03:05 -0400 (0:00:00.066) 0:00:13.748 ***** 42613 1727204585.14011: entering _queue_task() for managed-node3/setup 42613 1727204585.14505: worker is 1 (out of 1 available) 42613 1727204585.14518: exiting _queue_task() for managed-node3/setup 42613 1727204585.14531: done queuing things up, now waiting for results queue to drain 42613 1727204585.14532: waiting for pending results... 
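Annotation: the "Ensure ansible_facts used by role are present" task queued here is skipped further down because its `when:` condition, `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0`, evaluates to False. Jinja's `difference` filter is a set difference; a rough Python equivalent of that check (values below are hypothetical, for illustration only):

```python
def missing_facts(required, gathered):
    """Rough equivalent of the Jinja expression
    __network_required_facts | difference(ansible_facts.keys() | list):
    the facts the role needs that have not been gathered yet."""
    return [fact for fact in required if fact not in gathered]

# Hypothetical fact names and values, not taken from the log.
required = ["distribution", "os_family", "distribution_major_version"]
gathered = {
    "distribution": "Fedora",
    "os_family": "RedHat",
    "distribution_major_version": "40",
}

# `| length > 0` would trigger a setup (fact-gathering) run; here the
# difference is empty, so the task is skipped -- matching the
# "when evaluation is False, skipping this task" line in the log.
skip_setup = len(missing_facts(required, gathered)) == 0
```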
42613 1727204585.14947: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 42613 1727204585.15302: in run() - task 127b8e07-fff9-2f91-05d8-0000000003d0 42613 1727204585.15324: variable 'ansible_search_path' from source: unknown 42613 1727204585.15338: variable 'ansible_search_path' from source: unknown 42613 1727204585.15390: calling self._execute() 42613 1727204585.15497: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204585.15510: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204585.15525: variable 'omit' from source: magic vars 42613 1727204585.16149: variable 'ansible_distribution_major_version' from source: facts 42613 1727204585.16173: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204585.16474: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 42613 1727204585.20374: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 42613 1727204585.20379: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 42613 1727204585.20444: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 42613 1727204585.20492: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 42613 1727204585.20532: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 42613 1727204585.20696: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204585.20850: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204585.20854: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204585.20857: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204585.20860: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204585.20965: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204585.21001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204585.21033: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204585.21089: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204585.21110: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204585.21378: variable '__network_required_facts' from source: role 
'' defaults 42613 1727204585.21401: variable 'ansible_facts' from source: unknown 42613 1727204585.21509: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 42613 1727204585.21518: when evaluation is False, skipping this task 42613 1727204585.21525: _execute() done 42613 1727204585.21531: dumping result to json 42613 1727204585.21539: done dumping result, returning 42613 1727204585.21550: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [127b8e07-fff9-2f91-05d8-0000000003d0] 42613 1727204585.21561: sending task result for task 127b8e07-fff9-2f91-05d8-0000000003d0 42613 1727204585.21877: done sending task result for task 127b8e07-fff9-2f91-05d8-0000000003d0 42613 1727204585.21881: WORKER PROCESS EXITING skipping: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 42613 1727204585.21929: no more pending results, returning what we have 42613 1727204585.21933: results queue empty 42613 1727204585.21934: checking for any_errors_fatal 42613 1727204585.21936: done checking for any_errors_fatal 42613 1727204585.21937: checking for max_fail_percentage 42613 1727204585.21939: done checking for max_fail_percentage 42613 1727204585.21939: checking to see if all hosts have failed and the running result is not ok 42613 1727204585.21940: done checking to see if all hosts have failed 42613 1727204585.21941: getting the remaining hosts for this loop 42613 1727204585.21943: done getting the remaining hosts for this loop 42613 1727204585.21948: getting the next task for host managed-node3 42613 1727204585.21958: done getting next task for host managed-node3 42613 1727204585.21963: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 42613 1727204585.21968: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 42613 1727204585.21984: getting variables 42613 1727204585.21986: in VariableManager get_vars() 42613 1727204585.22030: Calling all_inventory to load vars for managed-node3 42613 1727204585.22033: Calling groups_inventory to load vars for managed-node3 42613 1727204585.22036: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204585.22047: Calling all_plugins_play to load vars for managed-node3 42613 1727204585.22051: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204585.22054: Calling groups_plugins_play to load vars for managed-node3 42613 1727204585.22652: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204585.23040: done with get_vars() 42613 1727204585.23055: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 15:03:05 -0400 (0:00:00.091) 0:00:13.839 ***** 42613 1727204585.23220: entering _queue_task() for managed-node3/stat 42613 1727204585.23853: worker is 1 (out of 1 
available) 42613 1727204585.23868: exiting _queue_task() for managed-node3/stat 42613 1727204585.23880: done queuing things up, now waiting for results queue to drain 42613 1727204585.23882: waiting for pending results... 42613 1727204585.24192: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 42613 1727204585.24317: in run() - task 127b8e07-fff9-2f91-05d8-0000000003d2 42613 1727204585.24337: variable 'ansible_search_path' from source: unknown 42613 1727204585.24345: variable 'ansible_search_path' from source: unknown 42613 1727204585.24393: calling self._execute() 42613 1727204585.24500: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204585.24516: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204585.24530: variable 'omit' from source: magic vars 42613 1727204585.24941: variable 'ansible_distribution_major_version' from source: facts 42613 1727204585.25048: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204585.25144: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 42613 1727204585.25450: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 42613 1727204585.25510: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 42613 1727204585.25549: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 42613 1727204585.25596: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 42613 1727204585.25692: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 42613 1727204585.25728: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 42613 1727204585.25759: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204585.25792: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 42613 1727204585.25897: variable '__network_is_ostree' from source: set_fact 42613 1727204585.25928: Evaluated conditional (not __network_is_ostree is defined): False 42613 1727204585.25931: when evaluation is False, skipping this task 42613 1727204585.25934: _execute() done 42613 1727204585.25936: dumping result to json 42613 1727204585.25938: done dumping result, returning 42613 1727204585.26038: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [127b8e07-fff9-2f91-05d8-0000000003d2] 42613 1727204585.26042: sending task result for task 127b8e07-fff9-2f91-05d8-0000000003d2 42613 1727204585.26127: done sending task result for task 127b8e07-fff9-2f91-05d8-0000000003d2 42613 1727204585.26130: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 42613 1727204585.26197: no more pending results, returning what we have 42613 1727204585.26201: results queue empty 42613 1727204585.26202: checking for any_errors_fatal 42613 1727204585.26209: done checking for any_errors_fatal 42613 1727204585.26210: checking for max_fail_percentage 42613 1727204585.26212: done checking for max_fail_percentage 42613 1727204585.26213: checking to see if all hosts have failed and the running result is not ok 42613 
1727204585.26214: done checking to see if all hosts have failed 42613 1727204585.26215: getting the remaining hosts for this loop 42613 1727204585.26216: done getting the remaining hosts for this loop 42613 1727204585.26221: getting the next task for host managed-node3 42613 1727204585.26229: done getting next task for host managed-node3 42613 1727204585.26234: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 42613 1727204585.26239: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204585.26256: getting variables 42613 1727204585.26258: in VariableManager get_vars() 42613 1727204585.26303: Calling all_inventory to load vars for managed-node3 42613 1727204585.26306: Calling groups_inventory to load vars for managed-node3 42613 1727204585.26308: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204585.26320: Calling all_plugins_play to load vars for managed-node3 42613 1727204585.26323: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204585.26326: Calling groups_plugins_play to load vars for managed-node3 42613 1727204585.26777: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204585.26988: done with get_vars() 42613 1727204585.27004: done getting variables 42613 1727204585.27074: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 15:03:05 -0400 (0:00:00.039) 0:00:13.879 ***** 42613 1727204585.27116: entering _queue_task() for managed-node3/set_fact 42613 1727204585.27702: worker is 1 (out of 1 available) 42613 1727204585.27715: exiting _queue_task() for managed-node3/set_fact 42613 1727204585.27727: done queuing things up, now waiting for results queue to drain 42613 1727204585.27728: waiting for pending results... 
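Annotation: both ostree tasks above ("Check if system is ostree" and "Set flag to indicate system is ostree") are skipped with `false_condition: not __network_is_ostree is defined`, because a `set_fact` earlier in the run already defined the flag. The guard pattern, sketched in Python (simplified; assumes the role's intent is a run-once probe, which the log's skip messages suggest):

```python
# Sketch of the run-once guard behind the two skipped ostree tasks:
# the stat check and the set_fact only execute when __network_is_ostree
# is not yet defined, so the probe runs at most once per host.

def ensure_ostree_flag(host_facts, probe):
    """Set host_facts['__network_is_ostree'] once, via probe()."""
    if "__network_is_ostree" not in host_facts:  # when: not ... is defined
        host_facts["__network_is_ostree"] = probe()
    return host_facts["__network_is_ostree"]

calls = []

def fake_probe():
    # Stand-in for the role's stat-based check; pretend it reports False.
    calls.append(1)
    return False

facts = {}
ensure_ostree_flag(facts, fake_probe)
ensure_ostree_flag(facts, fake_probe)  # guarded: skipped, like the log
```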
42613 1727204585.27860: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 42613 1727204585.28010: in run() - task 127b8e07-fff9-2f91-05d8-0000000003d3 42613 1727204585.28065: variable 'ansible_search_path' from source: unknown 42613 1727204585.28072: variable 'ansible_search_path' from source: unknown 42613 1727204585.28096: calling self._execute() 42613 1727204585.28192: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204585.28205: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204585.28284: variable 'omit' from source: magic vars 42613 1727204585.28638: variable 'ansible_distribution_major_version' from source: facts 42613 1727204585.28658: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204585.28856: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 42613 1727204585.29596: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 42613 1727204585.29655: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 42613 1727204585.29703: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 42613 1727204585.29743: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 42613 1727204585.29843: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 42613 1727204585.29919: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 42613 1727204585.29923: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204585.29944: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 42613 1727204585.30054: variable '__network_is_ostree' from source: set_fact 42613 1727204585.30071: Evaluated conditional (not __network_is_ostree is defined): False 42613 1727204585.30079: when evaluation is False, skipping this task 42613 1727204585.30087: _execute() done 42613 1727204585.30094: dumping result to json 42613 1727204585.30136: done dumping result, returning 42613 1727204585.30140: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [127b8e07-fff9-2f91-05d8-0000000003d3] 42613 1727204585.30143: sending task result for task 127b8e07-fff9-2f91-05d8-0000000003d3 skipping: [managed-node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 42613 1727204585.30294: no more pending results, returning what we have 42613 1727204585.30298: results queue empty 42613 1727204585.30299: checking for any_errors_fatal 42613 1727204585.30304: done checking for any_errors_fatal 42613 1727204585.30305: checking for max_fail_percentage 42613 1727204585.30307: done checking for max_fail_percentage 42613 1727204585.30308: checking to see if all hosts have failed and the running result is not ok 42613 1727204585.30309: done checking to see if all hosts have failed 42613 1727204585.30310: getting the remaining hosts for this loop 42613 1727204585.30312: done getting the remaining hosts for this loop 42613 1727204585.30317: getting the next task for host managed-node3 42613 1727204585.30329: done getting next task for host managed-node3 42613 
1727204585.30334: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 42613 1727204585.30339: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 42613 1727204585.30356: getting variables 42613 1727204585.30358: in VariableManager get_vars() 42613 1727204585.30405: Calling all_inventory to load vars for managed-node3 42613 1727204585.30408: Calling groups_inventory to load vars for managed-node3 42613 1727204585.30411: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204585.30424: Calling all_plugins_play to load vars for managed-node3 42613 1727204585.30428: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204585.30431: Calling groups_plugins_play to load vars for managed-node3 42613 1727204585.31207: done sending task result for task 127b8e07-fff9-2f91-05d8-0000000003d3 42613 1727204585.31212: WORKER PROCESS EXITING 42613 1727204585.31299: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204585.31550: done with get_vars() 42613 1727204585.31564: done getting variables TASK [fedora.linux_system_roles.network : Check which 
services are running] **** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 15:03:05 -0400 (0:00:00.045) 0:00:13.924 ***** 42613 1727204585.31669: entering _queue_task() for managed-node3/service_facts 42613 1727204585.31671: Creating lock for service_facts 42613 1727204585.32023: worker is 1 (out of 1 available) 42613 1727204585.32038: exiting _queue_task() for managed-node3/service_facts 42613 1727204585.32052: done queuing things up, now waiting for results queue to drain 42613 1727204585.32054: waiting for pending results... 42613 1727204585.32361: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which services are running 42613 1727204585.32550: in run() - task 127b8e07-fff9-2f91-05d8-0000000003d5 42613 1727204585.32575: variable 'ansible_search_path' from source: unknown 42613 1727204585.32584: variable 'ansible_search_path' from source: unknown 42613 1727204585.32630: calling self._execute() 42613 1727204585.32728: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204585.32747: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204585.32764: variable 'omit' from source: magic vars 42613 1727204585.33272: variable 'ansible_distribution_major_version' from source: facts 42613 1727204585.33276: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204585.33279: variable 'omit' from source: magic vars 42613 1727204585.33306: variable 'omit' from source: magic vars 42613 1727204585.33350: variable 'omit' from source: magic vars 42613 1727204585.33409: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 42613 1727204585.33456: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 42613 1727204585.33487: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 42613 1727204585.33520: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204585.33538: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204585.33578: variable 'inventory_hostname' from source: host vars for 'managed-node3' 42613 1727204585.33586: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204585.33595: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204585.33718: Set connection var ansible_shell_executable to /bin/sh 42613 1727204585.33731: Set connection var ansible_pipelining to False 42613 1727204585.33742: Set connection var ansible_module_compression to ZIP_DEFLATED 42613 1727204585.33872: Set connection var ansible_connection to ssh 42613 1727204585.33875: Set connection var ansible_timeout to 10 42613 1727204585.33877: Set connection var ansible_shell_type to sh 42613 1727204585.33879: variable 'ansible_shell_executable' from source: unknown 42613 1727204585.33882: variable 'ansible_connection' from source: unknown 42613 1727204585.33884: variable 'ansible_module_compression' from source: unknown 42613 1727204585.33886: variable 'ansible_shell_type' from source: unknown 42613 1727204585.33888: variable 'ansible_shell_executable' from source: unknown 42613 1727204585.33890: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204585.33892: variable 'ansible_pipelining' from source: unknown 42613 1727204585.33894: variable 'ansible_timeout' from source: unknown 42613 1727204585.33896: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204585.34033: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 42613 1727204585.34051: variable 'omit' from source: magic vars 42613 1727204585.34060: starting attempt loop 42613 1727204585.34070: running the handler 42613 1727204585.34086: _low_level_execute_command(): starting 42613 1727204585.34096: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 42613 1727204585.34885: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204585.34909: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204585.34995: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204585.35038: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204585.35063: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204585.35080: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204585.35190: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204585.37008: stdout 
chunk (state=3): >>>/root <<< 42613 1727204585.37210: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204585.37213: stdout chunk (state=3): >>><<< 42613 1727204585.37216: stderr chunk (state=3): >>><<< 42613 1727204585.37333: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204585.37337: _low_level_execute_command(): starting 42613 1727204585.37340: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204585.3724065-43955-7724762955942 `" && echo ansible-tmp-1727204585.3724065-43955-7724762955942="` echo /root/.ansible/tmp/ansible-tmp-1727204585.3724065-43955-7724762955942 `" ) && sleep 0' 42613 1727204585.37986: stderr chunk (state=2): >>>OpenSSH_9.6p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found <<< 42613 1727204585.38102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204585.38143: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204585.38258: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204585.40410: stdout chunk (state=3): >>>ansible-tmp-1727204585.3724065-43955-7724762955942=/root/.ansible/tmp/ansible-tmp-1727204585.3724065-43955-7724762955942 <<< 42613 1727204585.40640: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204585.40645: stdout chunk (state=3): >>><<< 42613 1727204585.40647: stderr chunk (state=3): >>><<< 42613 1727204585.40873: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204585.3724065-43955-7724762955942=/root/.ansible/tmp/ansible-tmp-1727204585.3724065-43955-7724762955942 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204585.40877: variable 'ansible_module_compression' from source: unknown 42613 1727204585.40880: ANSIBALLZ: Using lock for service_facts 42613 1727204585.40882: ANSIBALLZ: Acquiring lock 42613 1727204585.40884: ANSIBALLZ: Lock acquired: 139982756658640 42613 1727204585.40886: ANSIBALLZ: Creating module 42613 1727204585.57654: ANSIBALLZ: Writing module into payload 42613 1727204585.57779: ANSIBALLZ: Writing module 42613 1727204585.57810: ANSIBALLZ: Renaming module 42613 1727204585.57873: ANSIBALLZ: Done creating module 42613 1727204585.57878: variable 'ansible_facts' from source: unknown 42613 1727204585.57940: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204585.3724065-43955-7724762955942/AnsiballZ_service_facts.py 42613 1727204585.58113: Sending initial data 42613 1727204585.58213: Sent initial data (160 bytes) 42613 1727204585.58828: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 
1727204585.58962: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found <<< 42613 1727204585.58974: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204585.58994: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204585.59101: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204585.60877: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 42613 1727204585.60912: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server 
supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 42613 1727204585.60990: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 42613 1727204585.61087: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-42613igbktgy6/tmp2rugxuz2 /root/.ansible/tmp/ansible-tmp-1727204585.3724065-43955-7724762955942/AnsiballZ_service_facts.py <<< 42613 1727204585.61091: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204585.3724065-43955-7724762955942/AnsiballZ_service_facts.py" <<< 42613 1727204585.61164: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-42613igbktgy6/tmp2rugxuz2" to remote "/root/.ansible/tmp/ansible-tmp-1727204585.3724065-43955-7724762955942/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204585.3724065-43955-7724762955942/AnsiballZ_service_facts.py" <<< 42613 1727204585.62273: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204585.62278: stderr chunk (state=3): >>><<< 42613 1727204585.62280: stdout chunk (state=3): >>><<< 42613 1727204585.62282: done transferring module to remote 42613 1727204585.62284: _low_level_execute_command(): starting 42613 1727204585.62286: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204585.3724065-43955-7724762955942/ /root/.ansible/tmp/ansible-tmp-1727204585.3724065-43955-7724762955942/AnsiballZ_service_facts.py && sleep 0' 42613 1727204585.62872: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204585.62883: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204585.62896: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 42613 1727204585.62912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 42613 1727204585.62923: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 42613 1727204585.62930: stderr chunk (state=3): >>>debug2: match not found <<< 42613 1727204585.62941: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204585.62960: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 42613 1727204585.62992: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.169 is address <<< 42613 1727204585.62996: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 42613 1727204585.62999: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204585.63014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204585.63031: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 42613 1727204585.63050: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 42613 1727204585.63060: stderr chunk (state=3): >>>debug2: match found <<< 42613 1727204585.63077: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204585.63159: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204585.63180: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204585.63289: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204585.65406: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 
1727204585.65411: stdout chunk (state=3): >>><<< 42613 1727204585.65414: stderr chunk (state=3): >>><<< 42613 1727204585.65432: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204585.65473: _low_level_execute_command(): starting 42613 1727204585.65477: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204585.3724065-43955-7724762955942/AnsiballZ_service_facts.py && sleep 0' 42613 1727204585.66174: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204585.66189: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204585.66208: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204585.66328: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204585.66332: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204585.66359: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204585.66484: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204588.08121: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", 
"state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", 
"source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": 
"plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": 
"systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": 
"stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": 
{"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", 
"state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": 
"unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "nmstate.service": {"name": "nmstate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, <<< 42613 1727204588.08140: stdout chunk (state=3): >>>
"plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": 
"system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": 
"inactive", "status": "static", "source": "systemd"}, <<< 42613 1727204588.08144: stdout chunk (state=3): >>>"systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source":
"systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, 
"invocation": {"module_args": {}}} <<< 42613 1727204588.09574: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204588.09703: stderr chunk (state=3): >>>Shared connection to 10.31.45.169 closed. <<< 42613 1727204588.09871: stdout chunk (state=3): >>><<< 42613 1727204588.09876: stderr chunk (state=3): >>><<< 42613 1727204588.09881: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": 
"enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", 
"source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": 
"sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": 
"systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": 
"dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", 
"state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": 
"nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "nmstate.service": {"name": "nmstate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": 
"selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": 
{"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", 
"state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": 
"systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 
10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 42613 1727204588.11454: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204585.3724065-43955-7724762955942/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 42613 1727204588.11480: _low_level_execute_command(): starting 42613 1727204588.11490: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204585.3724065-43955-7724762955942/ > /dev/null 2>&1 && sleep 0' 42613 1727204588.12190: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204588.12242: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204588.12261: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 42613 1727204588.12346: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204588.12360: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204588.12389: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204588.12406: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204588.12513: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204588.19388: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204588.19441: stderr chunk (state=3): >>><<< 42613 1727204588.19456: stdout chunk (state=3): >>><<< 42613 1727204588.19484: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204588.19499: handler run complete 42613 1727204588.19741: variable 'ansible_facts' from source: unknown 42613 1727204588.19943: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204588.20475: variable 'ansible_facts' from source: unknown 42613 1727204588.20610: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204588.20764: attempt loop complete, returning result 42613 1727204588.20770: _execute() done 42613 1727204588.20772: dumping result to json 42613 1727204588.20817: done dumping result, returning 42613 1727204588.20829: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which services are running [127b8e07-fff9-2f91-05d8-0000000003d5] 42613 1727204588.20832: sending task result for task 127b8e07-fff9-2f91-05d8-0000000003d5 ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 42613 1727204588.22124: no more pending results, returning what we have 42613 1727204588.22129: results queue empty 42613 1727204588.22130: checking for any_errors_fatal 42613 1727204588.22134: done checking for any_errors_fatal 42613 1727204588.22135: checking for max_fail_percentage 42613 1727204588.22137: done checking for max_fail_percentage 42613 1727204588.22138: checking to see if all hosts have failed and the running result is not ok 42613 1727204588.22139: done 
checking to see if all hosts have failed 42613 1727204588.22139: getting the remaining hosts for this loop 42613 1727204588.22141: done getting the remaining hosts for this loop 42613 1727204588.22145: getting the next task for host managed-node3 42613 1727204588.22150: done getting next task for host managed-node3 42613 1727204588.22154: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 42613 1727204588.22158: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204588.22175: done sending task result for task 127b8e07-fff9-2f91-05d8-0000000003d5 42613 1727204588.22179: WORKER PROCESS EXITING 42613 1727204588.22186: getting variables 42613 1727204588.22187: in VariableManager get_vars() 42613 1727204588.22220: Calling all_inventory to load vars for managed-node3 42613 1727204588.22222: Calling groups_inventory to load vars for managed-node3 42613 1727204588.22225: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204588.22235: Calling all_plugins_play to load vars for managed-node3 42613 1727204588.22238: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204588.22241: Calling groups_plugins_play to load vars for managed-node3 42613 1727204588.22770: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204588.23135: done with get_vars() 42613 1727204588.23149: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 15:03:08 -0400 (0:00:02.915) 0:00:16.840 ***** 42613 1727204588.23235: entering _queue_task() for managed-node3/package_facts 42613 1727204588.23237: Creating lock for package_facts 42613 1727204588.23511: worker is 1 (out of 1 available) 42613 1727204588.23527: exiting _queue_task() for managed-node3/package_facts 42613 1727204588.23540: done queuing things up, now waiting for results queue to drain 42613 1727204588.23542: waiting for pending results... 
42613 1727204588.23732: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 42613 1727204588.23857: in run() - task 127b8e07-fff9-2f91-05d8-0000000003d6 42613 1727204588.23870: variable 'ansible_search_path' from source: unknown 42613 1727204588.23874: variable 'ansible_search_path' from source: unknown 42613 1727204588.23909: calling self._execute() 42613 1727204588.23981: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204588.23987: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204588.23995: variable 'omit' from source: magic vars 42613 1727204588.24358: variable 'ansible_distribution_major_version' from source: facts 42613 1727204588.24473: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204588.24476: variable 'omit' from source: magic vars 42613 1727204588.24491: variable 'omit' from source: magic vars 42613 1727204588.24533: variable 'omit' from source: magic vars 42613 1727204588.24583: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 42613 1727204588.24632: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 42613 1727204588.24661: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 42613 1727204588.24692: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204588.24708: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204588.24743: variable 'inventory_hostname' from source: host vars for 'managed-node3' 42613 1727204588.24752: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204588.24760: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed-node3' 42613 1727204588.24879: Set connection var ansible_shell_executable to /bin/sh 42613 1727204588.24890: Set connection var ansible_pipelining to False 42613 1727204588.24903: Set connection var ansible_module_compression to ZIP_DEFLATED 42613 1727204588.24911: Set connection var ansible_connection to ssh 42613 1727204588.24921: Set connection var ansible_timeout to 10 42613 1727204588.24926: Set connection var ansible_shell_type to sh 42613 1727204588.24958: variable 'ansible_shell_executable' from source: unknown 42613 1727204588.25171: variable 'ansible_connection' from source: unknown 42613 1727204588.25174: variable 'ansible_module_compression' from source: unknown 42613 1727204588.25177: variable 'ansible_shell_type' from source: unknown 42613 1727204588.25180: variable 'ansible_shell_executable' from source: unknown 42613 1727204588.25182: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204588.25185: variable 'ansible_pipelining' from source: unknown 42613 1727204588.25188: variable 'ansible_timeout' from source: unknown 42613 1727204588.25190: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204588.25239: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 42613 1727204588.25258: variable 'omit' from source: magic vars 42613 1727204588.25275: starting attempt loop 42613 1727204588.25284: running the handler 42613 1727204588.25308: _low_level_execute_command(): starting 42613 1727204588.25319: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 42613 1727204588.26159: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204588.26264: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204588.26314: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204588.26500: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204588.28194: stdout chunk (state=3): >>>/root <<< 42613 1727204588.28295: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204588.28380: stderr chunk (state=3): >>><<< 42613 1727204588.28391: stdout chunk (state=3): >>><<< 42613 1727204588.28412: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204588.28428: _low_level_execute_command(): starting 42613 1727204588.28439: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204588.2841873-44093-103310124336281 `" && echo ansible-tmp-1727204588.2841873-44093-103310124336281="` echo /root/.ansible/tmp/ansible-tmp-1727204588.2841873-44093-103310124336281 `" ) && sleep 0' 42613 1727204588.28998: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204588.29003: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204588.29086: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204588.29151: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204588.31314: stdout chunk (state=3): >>>ansible-tmp-1727204588.2841873-44093-103310124336281=/root/.ansible/tmp/ansible-tmp-1727204588.2841873-44093-103310124336281 <<< 42613 1727204588.31581: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204588.31586: stdout chunk (state=3): >>><<< 42613 1727204588.31588: stderr chunk (state=3): >>><<< 42613 1727204588.31599: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204588.2841873-44093-103310124336281=/root/.ansible/tmp/ansible-tmp-1727204588.2841873-44093-103310124336281 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204588.31670: variable 'ansible_module_compression' from source: unknown 42613 1727204588.31733: ANSIBALLZ: Using lock for package_facts 42613 1727204588.31739: ANSIBALLZ: Acquiring lock 42613 1727204588.31741: ANSIBALLZ: Lock acquired: 139982756769392 42613 1727204588.31743: ANSIBALLZ: Creating module 42613 1727204588.63486: ANSIBALLZ: Writing module into payload 42613 1727204588.63583: ANSIBALLZ: Writing module 42613 1727204588.63704: ANSIBALLZ: Renaming module 42613 1727204588.63708: ANSIBALLZ: Done creating module 42613 1727204588.63710: variable 'ansible_facts' from source: unknown 42613 1727204588.63926: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204588.2841873-44093-103310124336281/AnsiballZ_package_facts.py 42613 1727204588.64164: Sending initial data 42613 1727204588.64170: Sent initial data (162 bytes) 42613 1727204588.64891: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204588.64914: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204588.65040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204588.65089: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204588.65168: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204588.67007: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 42613 1727204588.67242: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 42613 1727204588.67247: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-42613igbktgy6/tmpzc9v6z7s /root/.ansible/tmp/ansible-tmp-1727204588.2841873-44093-103310124336281/AnsiballZ_package_facts.py <<< 42613 1727204588.67250: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204588.2841873-44093-103310124336281/AnsiballZ_package_facts.py" <<< 42613 1727204588.67288: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-42613igbktgy6/tmpzc9v6z7s" to remote "/root/.ansible/tmp/ansible-tmp-1727204588.2841873-44093-103310124336281/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204588.2841873-44093-103310124336281/AnsiballZ_package_facts.py" <<< 42613 1727204588.69673: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204588.69677: stderr chunk (state=3): >>><<< 42613 1727204588.69680: stdout chunk (state=3): >>><<< 42613 1727204588.69684: done transferring module to remote 42613 1727204588.69686: _low_level_execute_command(): starting 42613 1727204588.69689: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204588.2841873-44093-103310124336281/ /root/.ansible/tmp/ansible-tmp-1727204588.2841873-44093-103310124336281/AnsiballZ_package_facts.py && sleep 0' 42613 1727204588.70324: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204588.70331: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204588.70353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204588.70363: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 42613 1727204588.70379: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.45.169 originally 10.31.45.169 <<< 42613 1727204588.70459: stderr chunk (state=3): >>>debug2: match not found <<< 42613 1727204588.70462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204588.70467: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 42613 1727204588.70470: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.169 is address <<< 42613 1727204588.70472: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 42613 1727204588.70474: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204588.70476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204588.70478: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 42613 1727204588.70485: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 42613 1727204588.70487: stderr chunk (state=3): >>>debug2: match found <<< 42613 1727204588.70490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204588.70606: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204588.70609: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204588.70611: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204588.70690: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204588.72741: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204588.72856: stderr chunk (state=3): >>><<< 42613 1727204588.72877: stdout chunk (state=3): >>><<< 42613 1727204588.72989: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204588.72993: _low_level_execute_command(): starting 42613 1727204588.72996: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204588.2841873-44093-103310124336281/AnsiballZ_package_facts.py && sleep 0' 42613 1727204588.73637: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204588.73678: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204588.73692: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found <<< 42613 1727204588.73719: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204588.73731: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration <<< 42613 1727204588.73829: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204588.73857: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204588.73974: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204589.38239: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": 
"20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, <<< 42613 1727204589.38259: stdout chunk (state=3): >>>"arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "29.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", 
"release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "<<< 42613 1727204589.38290: stdout chunk (state=3): >>>systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", 
"release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": 
"x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "li<<< 42613 1727204589.38323: stdout chunk (state=3): >>>breport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": 
"9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-l<<< 42613 1727204589.38335: stdout chunk 
(state=3): >>>ibs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", 
"version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": 
"0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lib<<< 42613 1727204589.38382: stdout chunk (state=3): >>>xmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", 
"version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": 
"3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": 
"2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": 
"python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_<<< 42613 1727204589.38390: stdout chunk (state=3): >>>64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": 
"1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": 
"2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": 
"x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "<<< 42613 1727204589.38400: stdout chunk (state=3): >>>rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", 
"version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarc<<< 42613 1727204589.38421: stdout chunk (state=3): >>>h", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", 
"release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": 
[{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}]<<< 42613 1727204589.38430: stdout chunk (state=3): >>>, "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": 
"4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": 
"cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50<<< 42613 1727204589.38446: stdout chunk (state=3): >>>, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": 
"vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": "wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": 
"rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoc<<< 42613 1727204589.38472: stdout chunk (state=3): >>>h": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "s<<< 42613 1727204589.38480: stdout chunk (state=3): >>>ource": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": 
"8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", 
"release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "nmstate": [{"name": "nmstate", "version": "2.2.34", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-config-server": [{"name": "NetworkManager-config-server", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "nmstate-libs": [{"name": "nmstate-libs", "version": "2.2.34", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libnmstate": [{"name": "python3-libnmstate", "version": "2.2.34", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 42613 1727204589.40684: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
<<< 42613 1727204589.40853: stderr chunk (state=3): >>><<< 42613 1727204589.40886: stdout chunk (state=3): >>><<< 42613 1727204589.41066: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", 
"version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "29.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": 
[{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", 
"version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": 
"0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": 
"1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": 
"1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": 
"3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": 
"4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": 
"4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": 
"x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": 
"perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": 
"1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], 
"perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": 
"6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": "wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", 
"version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": 
"python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", 
"version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", 
"version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "nmstate": [{"name": "nmstate", "version": "2.2.34", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-config-server": [{"name": "NetworkManager-config-server", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "nmstate-libs": [{"name": "nmstate-libs", "version": "2.2.34", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libnmstate": [{"name": "python3-libnmstate", "version": "2.2.34", "release": "1.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
42613 1727204589.47090: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204588.2841873-44093-103310124336281/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 42613 1727204589.47390: _low_level_execute_command(): starting 42613 1727204589.47421: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204588.2841873-44093-103310124336281/ > /dev/null 2>&1 && sleep 0' 42613 1727204589.48720: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204589.48773: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204589.48838: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204589.49039: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204589.49221: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204589.51277: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204589.51296: stderr chunk (state=3): >>><<< 42613 1727204589.51307: stdout chunk (state=3): >>><<< 42613 1727204589.51329: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204589.51435: handler run complete 42613 1727204589.52558: variable 'ansible_facts' from source: unknown 42613 
1727204589.58499: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204589.60321: variable 'ansible_facts' from source: unknown 42613 1727204589.60837: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204589.61605: attempt loop complete, returning result 42613 1727204589.61639: _execute() done 42613 1727204589.61643: dumping result to json 42613 1727204589.61841: done dumping result, returning 42613 1727204589.61849: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [127b8e07-fff9-2f91-05d8-0000000003d6] 42613 1727204589.61852: sending task result for task 127b8e07-fff9-2f91-05d8-0000000003d6 42613 1727204589.65190: done sending task result for task 127b8e07-fff9-2f91-05d8-0000000003d6 42613 1727204589.65193: WORKER PROCESS EXITING ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 42613 1727204589.65295: no more pending results, returning what we have 42613 1727204589.65298: results queue empty 42613 1727204589.65304: checking for any_errors_fatal 42613 1727204589.65313: done checking for any_errors_fatal 42613 1727204589.65314: checking for max_fail_percentage 42613 1727204589.65318: done checking for max_fail_percentage 42613 1727204589.65321: checking to see if all hosts have failed and the running result is not ok 42613 1727204589.65322: done checking to see if all hosts have failed 42613 1727204589.65323: getting the remaining hosts for this loop 42613 1727204589.65324: done getting the remaining hosts for this loop 42613 1727204589.65331: getting the next task for host managed-node3 42613 1727204589.65344: done getting next task for host managed-node3 42613 1727204589.65348: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 42613 
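The `censored` result above is Ansible's standard redaction when `no_log: true` is set on a task. Based on the module arguments visible earlier in the log (`manager: ["auto"]`, `strategy: "first"`), the task that produced it likely resembles the following sketch — a reconstruction, not the role's actual source:

```yaml
# Hedged reconstruction of the censored task; exact layout is an assumption.
- name: Check which packages are installed
  ansible.builtin.package_facts:
    manager: auto   # matches module_args {"manager": ["auto"]} in the log
  no_log: true      # produces the "censored" placeholder in the task result
```

With `no_log: true`, the (large) package inventory still lands in `ansible_facts`, but the raw result is hidden from stdout and callbacks.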
1727204589.65352: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 42613 1727204589.65368: getting variables 42613 1727204589.65370: in VariableManager get_vars() 42613 1727204589.65413: Calling all_inventory to load vars for managed-node3 42613 1727204589.65416: Calling groups_inventory to load vars for managed-node3 42613 1727204589.65419: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204589.65430: Calling all_plugins_play to load vars for managed-node3 42613 1727204589.65432: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204589.65439: Calling groups_plugins_play to load vars for managed-node3 42613 1727204589.66463: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204589.68089: done with get_vars() 42613 1727204589.68144: done getting variables 42613 1727204589.68223: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 15:03:09 -0400 (0:00:01.450) 0:00:18.290 ***** 
42613 1727204589.68263: entering _queue_task() for managed-node3/debug 42613 1727204589.68657: worker is 1 (out of 1 available) 42613 1727204589.68672: exiting _queue_task() for managed-node3/debug 42613 1727204589.68685: done queuing things up, now waiting for results queue to drain 42613 1727204589.68686: waiting for pending results... 42613 1727204589.69090: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider 42613 1727204589.69209: in run() - task 127b8e07-fff9-2f91-05d8-000000000018 42613 1727204589.69243: variable 'ansible_search_path' from source: unknown 42613 1727204589.69258: variable 'ansible_search_path' from source: unknown 42613 1727204589.69323: calling self._execute() 42613 1727204589.69450: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204589.69460: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204589.69480: variable 'omit' from source: magic vars 42613 1727204589.70286: variable 'ansible_distribution_major_version' from source: facts 42613 1727204589.70317: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204589.70343: variable 'omit' from source: magic vars 42613 1727204589.70441: variable 'omit' from source: magic vars 42613 1727204589.70522: variable 'network_provider' from source: set_fact 42613 1727204589.70535: variable 'omit' from source: magic vars 42613 1727204589.70580: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 42613 1727204589.70611: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 42613 1727204589.70679: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 42613 1727204589.70684: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204589.70686: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204589.70715: variable 'inventory_hostname' from source: host vars for 'managed-node3' 42613 1727204589.70718: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204589.70721: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204589.70843: Set connection var ansible_shell_executable to /bin/sh 42613 1727204589.70849: Set connection var ansible_pipelining to False 42613 1727204589.70860: Set connection var ansible_module_compression to ZIP_DEFLATED 42613 1727204589.70901: Set connection var ansible_connection to ssh 42613 1727204589.70904: Set connection var ansible_timeout to 10 42613 1727204589.70906: Set connection var ansible_shell_type to sh 42613 1727204589.70914: variable 'ansible_shell_executable' from source: unknown 42613 1727204589.70917: variable 'ansible_connection' from source: unknown 42613 1727204589.70920: variable 'ansible_module_compression' from source: unknown 42613 1727204589.70922: variable 'ansible_shell_type' from source: unknown 42613 1727204589.70925: variable 'ansible_shell_executable' from source: unknown 42613 1727204589.70930: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204589.70932: variable 'ansible_pipelining' from source: unknown 42613 1727204589.70940: variable 'ansible_timeout' from source: unknown 42613 1727204589.70943: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204589.71107: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 42613 1727204589.71110: variable 'omit' from source: magic vars 42613 
1727204589.71113: starting attempt loop 42613 1727204589.71116: running the handler 42613 1727204589.71158: handler run complete 42613 1727204589.71195: attempt loop complete, returning result 42613 1727204589.71199: _execute() done 42613 1727204589.71202: dumping result to json 42613 1727204589.71209: done dumping result, returning 42613 1727204589.71212: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider [127b8e07-fff9-2f91-05d8-000000000018] 42613 1727204589.71214: sending task result for task 127b8e07-fff9-2f91-05d8-000000000018 42613 1727204589.71314: done sending task result for task 127b8e07-fff9-2f91-05d8-000000000018 42613 1727204589.71317: WORKER PROCESS EXITING ok: [managed-node3] => {} MSG: Using network provider: nm 42613 1727204589.71426: no more pending results, returning what we have 42613 1727204589.71430: results queue empty 42613 1727204589.71431: checking for any_errors_fatal 42613 1727204589.71439: done checking for any_errors_fatal 42613 1727204589.71440: checking for max_fail_percentage 42613 1727204589.71444: done checking for max_fail_percentage 42613 1727204589.71444: checking to see if all hosts have failed and the running result is not ok 42613 1727204589.71445: done checking to see if all hosts have failed 42613 1727204589.71446: getting the remaining hosts for this loop 42613 1727204589.71448: done getting the remaining hosts for this loop 42613 1727204589.71452: getting the next task for host managed-node3 42613 1727204589.71460: done getting next task for host managed-node3 42613 1727204589.71464: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 42613 1727204589.71469: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
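The `MSG: Using network provider: nm` output above comes from a plain `debug` task. A minimal sketch of such a task, assuming a `network_provider` fact set earlier via `set_fact` (consistent with the `variable 'network_provider' from source: set_fact` line in the trace):

```yaml
# Illustrative only; the role's real task wording may differ.
- name: Print network provider
  ansible.builtin.debug:
    msg: "Using network provider: {{ network_provider }}"
```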
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 42613 1727204589.71480: getting variables 42613 1727204589.71481: in VariableManager get_vars() 42613 1727204589.71516: Calling all_inventory to load vars for managed-node3 42613 1727204589.71519: Calling groups_inventory to load vars for managed-node3 42613 1727204589.71521: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204589.71602: Calling all_plugins_play to load vars for managed-node3 42613 1727204589.71611: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204589.71621: Calling groups_plugins_play to load vars for managed-node3 42613 1727204589.73026: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204589.75654: done with get_vars() 42613 1727204589.75715: done getting variables 42613 1727204589.75789: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 15:03:09 -0400 (0:00:00.075) 0:00:18.366 ***** 42613 1727204589.75826: entering _queue_task() for managed-node3/fail 42613 1727204589.76219: worker is 1 (out of 1 available) 
42613 1727204589.76240: exiting _queue_task() for managed-node3/fail 42613 1727204589.76253: done queuing things up, now waiting for results queue to drain 42613 1727204589.76255: waiting for pending results... 42613 1727204589.76690: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 42613 1727204589.76762: in run() - task 127b8e07-fff9-2f91-05d8-000000000019 42613 1727204589.76789: variable 'ansible_search_path' from source: unknown 42613 1727204589.76797: variable 'ansible_search_path' from source: unknown 42613 1727204589.76841: calling self._execute() 42613 1727204589.76950: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204589.76964: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204589.76983: variable 'omit' from source: magic vars 42613 1727204589.77713: variable 'ansible_distribution_major_version' from source: facts 42613 1727204589.77734: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204589.78216: variable 'network_state' from source: role '' defaults 42613 1727204589.78327: Evaluated conditional (network_state != {}): False 42613 1727204589.78331: when evaluation is False, skipping this task 42613 1727204589.78334: _execute() done 42613 1727204589.78336: dumping result to json 42613 1727204589.78339: done dumping result, returning 42613 1727204589.78341: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [127b8e07-fff9-2f91-05d8-000000000019] 42613 1727204589.78344: sending task result for task 127b8e07-fff9-2f91-05d8-000000000019 42613 1727204589.78576: done sending task result for task 127b8e07-fff9-2f91-05d8-000000000019 42613 1727204589.78580: WORKER 
PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 42613 1727204589.78643: no more pending results, returning what we have 42613 1727204589.78647: results queue empty 42613 1727204589.78649: checking for any_errors_fatal 42613 1727204589.78656: done checking for any_errors_fatal 42613 1727204589.78657: checking for max_fail_percentage 42613 1727204589.78659: done checking for max_fail_percentage 42613 1727204589.78660: checking to see if all hosts have failed and the running result is not ok 42613 1727204589.78661: done checking to see if all hosts have failed 42613 1727204589.78662: getting the remaining hosts for this loop 42613 1727204589.78663: done getting the remaining hosts for this loop 42613 1727204589.78670: getting the next task for host managed-node3 42613 1727204589.78678: done getting next task for host managed-node3 42613 1727204589.78683: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 42613 1727204589.78687: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
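The skip above shows a guard implemented as a `fail` task whose `when` clause evaluated to false: `network_state != {}` was `False` (the variable came from the role's defaults), so the abort never fired. A hedged sketch of that pattern, with a placeholder message since the role's actual text is not in the log:

```yaml
# Sketch of the guard pattern seen in the log; msg text is a placeholder.
- name: >-
    Abort applying the network state configuration if using the
    `network_state` variable with the initscripts provider
  ansible.builtin.fail:
    msg: "Placeholder: network_state is not supported with the initscripts provider."
  when: network_state != {}   # skipped here because network_state defaulted to {}
```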
False 42613 1727204589.78706: getting variables 42613 1727204589.78708: in VariableManager get_vars() 42613 1727204589.78755: Calling all_inventory to load vars for managed-node3 42613 1727204589.78758: Calling groups_inventory to load vars for managed-node3 42613 1727204589.78760: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204589.78886: Calling all_plugins_play to load vars for managed-node3 42613 1727204589.78890: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204589.78894: Calling groups_plugins_play to load vars for managed-node3 42613 1727204589.80719: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204589.83204: done with get_vars() 42613 1727204589.83248: done getting variables 42613 1727204589.83316: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 15:03:09 -0400 (0:00:00.075) 0:00:18.441 ***** 42613 1727204589.83354: entering _queue_task() for managed-node3/fail 42613 1727204589.83732: worker is 1 (out of 1 available) 42613 1727204589.83749: exiting _queue_task() for managed-node3/fail 42613 1727204589.83762: done queuing things up, now waiting for results queue to drain 42613 1727204589.83763: waiting for pending results... 
42613 1727204589.84098: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 42613 1727204589.84302: in run() - task 127b8e07-fff9-2f91-05d8-00000000001a 42613 1727204589.84307: variable 'ansible_search_path' from source: unknown 42613 1727204589.84309: variable 'ansible_search_path' from source: unknown 42613 1727204589.84330: calling self._execute() 42613 1727204589.84440: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204589.84453: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204589.84471: variable 'omit' from source: magic vars 42613 1727204589.84904: variable 'ansible_distribution_major_version' from source: facts 42613 1727204589.84959: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204589.85077: variable 'network_state' from source: role '' defaults 42613 1727204589.85095: Evaluated conditional (network_state != {}): False 42613 1727204589.85102: when evaluation is False, skipping this task 42613 1727204589.85109: _execute() done 42613 1727204589.85115: dumping result to json 42613 1727204589.85173: done dumping result, returning 42613 1727204589.85177: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [127b8e07-fff9-2f91-05d8-00000000001a] 42613 1727204589.85180: sending task result for task 127b8e07-fff9-2f91-05d8-00000000001a skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 42613 1727204589.85423: no more pending results, returning what we have 42613 1727204589.85428: results queue empty 42613 1727204589.85429: checking for any_errors_fatal 42613 1727204589.85441: done checking for any_errors_fatal 
42613 1727204589.85442: checking for max_fail_percentage 42613 1727204589.85444: done checking for max_fail_percentage 42613 1727204589.85445: checking to see if all hosts have failed and the running result is not ok 42613 1727204589.85446: done checking to see if all hosts have failed 42613 1727204589.85447: getting the remaining hosts for this loop 42613 1727204589.85449: done getting the remaining hosts for this loop 42613 1727204589.85453: getting the next task for host managed-node3 42613 1727204589.85461: done getting next task for host managed-node3 42613 1727204589.85468: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 42613 1727204589.85472: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204589.85493: getting variables 42613 1727204589.85495: in VariableManager get_vars() 42613 1727204589.85543: Calling all_inventory to load vars for managed-node3 42613 1727204589.85547: Calling groups_inventory to load vars for managed-node3 42613 1727204589.85550: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204589.85774: Calling all_plugins_play to load vars for managed-node3 42613 1727204589.85779: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204589.85784: Calling groups_plugins_play to load vars for managed-node3 42613 1727204589.86484: done sending task result for task 127b8e07-fff9-2f91-05d8-00000000001a 42613 1727204589.86488: WORKER PROCESS EXITING 42613 1727204589.87613: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204589.89830: done with get_vars() 42613 1727204589.89877: done getting variables 42613 1727204589.89948: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 15:03:09 -0400 (0:00:00.066) 0:00:18.508 ***** 42613 1727204589.89989: entering _queue_task() for managed-node3/fail 42613 1727204589.90603: worker is 1 (out of 1 available) 42613 1727204589.90616: exiting _queue_task() for managed-node3/fail 42613 1727204589.90626: done queuing things up, now waiting for results queue to drain 42613 1727204589.90627: waiting for pending results... 
42613 1727204589.90739: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 42613 1727204589.90920: in run() - task 127b8e07-fff9-2f91-05d8-00000000001b 42613 1727204589.90957: variable 'ansible_search_path' from source: unknown 42613 1727204589.90975: variable 'ansible_search_path' from source: unknown 42613 1727204589.91024: calling self._execute() 42613 1727204589.91129: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204589.91144: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204589.91159: variable 'omit' from source: magic vars 42613 1727204589.91586: variable 'ansible_distribution_major_version' from source: facts 42613 1727204589.91604: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204589.92085: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 42613 1727204589.97474: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 42613 1727204589.98563: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 42613 1727204589.98637: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 42613 1727204589.98691: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 42613 1727204589.98726: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 42613 1727204589.98858: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204589.98902: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204589.98939: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204589.98990: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204589.99018: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204589.99171: variable 'ansible_distribution_major_version' from source: facts 42613 1727204589.99174: Evaluated conditional (ansible_distribution_major_version | int > 9): True 42613 1727204589.99576: variable 'ansible_distribution' from source: facts 42613 1727204589.99588: variable '__network_rh_distros' from source: role '' defaults 42613 1727204589.99773: Evaluated conditional (ansible_distribution in __network_rh_distros): False 42613 1727204589.99777: when evaluation is False, skipping this task 42613 1727204589.99779: _execute() done 42613 1727204589.99781: dumping result to json 42613 1727204589.99783: done dumping result, returning 42613 1727204589.99786: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [127b8e07-fff9-2f91-05d8-00000000001b] 42613 1727204589.99788: sending task result for task 127b8e07-fff9-2f91-05d8-00000000001b skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": 
"Conditional result was False" } 42613 1727204589.99927: no more pending results, returning what we have 42613 1727204589.99931: results queue empty 42613 1727204589.99932: checking for any_errors_fatal 42613 1727204589.99940: done checking for any_errors_fatal 42613 1727204589.99941: checking for max_fail_percentage 42613 1727204589.99944: done checking for max_fail_percentage 42613 1727204589.99945: checking to see if all hosts have failed and the running result is not ok 42613 1727204589.99946: done checking to see if all hosts have failed 42613 1727204589.99946: getting the remaining hosts for this loop 42613 1727204589.99948: done getting the remaining hosts for this loop 42613 1727204589.99952: getting the next task for host managed-node3 42613 1727204589.99961: done getting next task for host managed-node3 42613 1727204589.99966: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 42613 1727204589.99970: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204589.99986: getting variables 42613 1727204589.99989: in VariableManager get_vars() 42613 1727204590.00032: Calling all_inventory to load vars for managed-node3 42613 1727204590.00038: Calling groups_inventory to load vars for managed-node3 42613 1727204590.00040: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204590.00052: Calling all_plugins_play to load vars for managed-node3 42613 1727204590.00055: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204590.00057: Calling groups_plugins_play to load vars for managed-node3 42613 1727204590.01077: done sending task result for task 127b8e07-fff9-2f91-05d8-00000000001b 42613 1727204590.01081: WORKER PROCESS EXITING 42613 1727204590.03207: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204590.06113: done with get_vars() 42613 1727204590.06154: done getting variables 42613 1727204590.06280: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 15:03:10 -0400 (0:00:00.163) 0:00:18.671 ***** 42613 1727204590.06317: entering _queue_task() for managed-node3/dnf 42613 1727204590.06705: worker is 1 (out of 1 available) 42613 1727204590.06720: exiting _queue_task() for managed-node3/dnf 42613 1727204590.06733: done queuing things up, now waiting for results queue to drain 42613 1727204590.06734: waiting for pending results... 
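The skipped task above shows the role guarding teaming configuration behind two conditionals: the distribution major version must be greater than 9, and the distribution must appear in `__network_rh_distros`. On this host the second check evaluated False, so the task was skipped. A hypothetical sketch of such a guard task (variable names taken from the log; the exact task body is an assumption, not the role's actual source):

```yaml
# Hypothetical sketch of the version-guard task skipped in the log above.
# Conditions mirror the two "Evaluated conditional" lines; body is assumed.
- name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
  fail:
    msg: Teaming configuration is not supported on EL10 or later
  when:
    - ansible_distribution_major_version | int > 9
    - ansible_distribution in __network_rh_distros
```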
42613 1727204590.07061: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 42613 1727204590.07233: in run() - task 127b8e07-fff9-2f91-05d8-00000000001c 42613 1727204590.07256: variable 'ansible_search_path' from source: unknown 42613 1727204590.07267: variable 'ansible_search_path' from source: unknown 42613 1727204590.07315: calling self._execute() 42613 1727204590.07418: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204590.07430: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204590.07445: variable 'omit' from source: magic vars 42613 1727204590.07872: variable 'ansible_distribution_major_version' from source: facts 42613 1727204590.07891: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204590.08120: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 42613 1727204590.10706: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 42613 1727204590.10808: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 42613 1727204590.10859: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 42613 1727204590.10906: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 42613 1727204590.10939: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 42613 1727204590.11083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204590.11087: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204590.11107: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204590.11156: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204590.11180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204590.11321: variable 'ansible_distribution' from source: facts 42613 1727204590.11331: variable 'ansible_distribution_major_version' from source: facts 42613 1727204590.11343: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 42613 1727204590.11514: variable '__network_wireless_connections_defined' from source: role '' defaults 42613 1727204590.11646: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204590.11680: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204590.11711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204590.11764: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204590.11787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204590.11925: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204590.11929: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204590.11932: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204590.12371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204590.12376: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204590.12378: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204590.12380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 
1727204590.12383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204590.12387: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204590.12389: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204590.12938: variable 'network_connections' from source: task vars 42613 1727204590.12943: variable 'interface' from source: set_fact 42613 1727204590.13153: variable 'interface' from source: set_fact 42613 1727204590.13157: variable 'interface' from source: set_fact 42613 1727204590.13159: variable 'interface' from source: set_fact 42613 1727204590.13361: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 42613 1727204590.13808: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 42613 1727204590.14082: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 42613 1727204590.14191: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 42613 1727204590.14195: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 42613 1727204590.14227: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 42613 1727204590.14255: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 42613 1727204590.14326: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204590.14439: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 42613 1727204590.14627: variable '__network_team_connections_defined' from source: role '' defaults 42613 1727204590.15165: variable 'network_connections' from source: task vars 42613 1727204590.15295: variable 'interface' from source: set_fact 42613 1727204590.15399: variable 'interface' from source: set_fact 42613 1727204590.15480: variable 'interface' from source: set_fact 42613 1727204590.15556: variable 'interface' from source: set_fact 42613 1727204590.15674: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 42613 1727204590.15973: when evaluation is False, skipping this task 42613 1727204590.15976: _execute() done 42613 1727204590.15978: dumping result to json 42613 1727204590.15980: done dumping result, returning 42613 1727204590.15982: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [127b8e07-fff9-2f91-05d8-00000000001c] 42613 1727204590.15985: sending task result for task 127b8e07-fff9-2f91-05d8-00000000001c 42613 1727204590.16062: done sending task result for task 127b8e07-fff9-2f91-05d8-00000000001c 42613 1727204590.16067: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": 
"Conditional result was False" } 42613 1727204590.16137: no more pending results, returning what we have 42613 1727204590.16141: results queue empty 42613 1727204590.16142: checking for any_errors_fatal 42613 1727204590.16149: done checking for any_errors_fatal 42613 1727204590.16150: checking for max_fail_percentage 42613 1727204590.16153: done checking for max_fail_percentage 42613 1727204590.16153: checking to see if all hosts have failed and the running result is not ok 42613 1727204590.16154: done checking to see if all hosts have failed 42613 1727204590.16155: getting the remaining hosts for this loop 42613 1727204590.16157: done getting the remaining hosts for this loop 42613 1727204590.16162: getting the next task for host managed-node3 42613 1727204590.16172: done getting next task for host managed-node3 42613 1727204590.16177: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 42613 1727204590.16180: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204590.16198: getting variables 42613 1727204590.16199: in VariableManager get_vars() 42613 1727204590.16240: Calling all_inventory to load vars for managed-node3 42613 1727204590.16243: Calling groups_inventory to load vars for managed-node3 42613 1727204590.16246: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204590.16258: Calling all_plugins_play to load vars for managed-node3 42613 1727204590.16261: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204590.16264: Calling groups_plugins_play to load vars for managed-node3 42613 1727204590.18332: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204590.27658: done with get_vars() 42613 1727204590.27907: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 42613 1727204590.27984: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 15:03:10 -0400 (0:00:00.216) 0:00:18.888 ***** 42613 1727204590.28017: entering _queue_task() for managed-node3/yum 42613 1727204590.28019: Creating lock for yum 42613 1727204590.28771: worker is 1 (out of 1 available) 42613 1727204590.28783: exiting _queue_task() for managed-node3/yum 42613 1727204590.28794: done queuing things up, now waiting for results queue to drain 42613 1727204590.28796: waiting for pending results... 
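The two update-check tasks in this stretch of the log are gated on the same wireless/team condition, with the DNF variant limited to Fedora or EL8+ and the YUM variant (which Ansible redirects to the `dnf` action) limited to EL7 and earlier. A hypothetical sketch of the DNF-side task, assuming a package-list variable such as `network_packages` (the list name and task body are assumptions, not the role's actual source):

```yaml
# Hypothetical sketch of the DNF update-check task; conditions are copied
# from the "Evaluated conditional" lines in the log, the rest is assumed.
- name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
  dnf:
    name: "{{ network_packages }}"  # package list variable is an assumption
    state: latest
  check_mode: true
  when:
    - __network_wireless_connections_defined or __network_team_connections_defined
    - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
```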
42613 1727204590.29588: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 42613 1727204590.30575: in run() - task 127b8e07-fff9-2f91-05d8-00000000001d 42613 1727204590.30582: variable 'ansible_search_path' from source: unknown 42613 1727204590.30587: variable 'ansible_search_path' from source: unknown 42613 1727204590.30592: calling self._execute() 42613 1727204590.30719: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204590.30735: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204590.30972: variable 'omit' from source: magic vars 42613 1727204590.31793: variable 'ansible_distribution_major_version' from source: facts 42613 1727204590.31798: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204590.32220: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 42613 1727204590.37799: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 42613 1727204590.37912: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 42613 1727204590.37962: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 42613 1727204590.38018: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 42613 1727204590.38055: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 42613 1727204590.38175: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204590.38223: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204590.38259: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204590.38316: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204590.38416: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204590.38476: variable 'ansible_distribution_major_version' from source: facts 42613 1727204590.38498: Evaluated conditional (ansible_distribution_major_version | int < 8): False 42613 1727204590.38505: when evaluation is False, skipping this task 42613 1727204590.38512: _execute() done 42613 1727204590.38523: dumping result to json 42613 1727204590.38532: done dumping result, returning 42613 1727204590.38553: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [127b8e07-fff9-2f91-05d8-00000000001d] 42613 1727204590.38567: sending task result for task 127b8e07-fff9-2f91-05d8-00000000001d 42613 1727204590.38814: done sending task result for task 127b8e07-fff9-2f91-05d8-00000000001d 42613 1727204590.38818: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 42613 1727204590.38884: no more pending results, returning 
what we have 42613 1727204590.38888: results queue empty 42613 1727204590.38889: checking for any_errors_fatal 42613 1727204590.38897: done checking for any_errors_fatal 42613 1727204590.38898: checking for max_fail_percentage 42613 1727204590.38900: done checking for max_fail_percentage 42613 1727204590.38901: checking to see if all hosts have failed and the running result is not ok 42613 1727204590.38902: done checking to see if all hosts have failed 42613 1727204590.38903: getting the remaining hosts for this loop 42613 1727204590.38905: done getting the remaining hosts for this loop 42613 1727204590.38909: getting the next task for host managed-node3 42613 1727204590.38917: done getting next task for host managed-node3 42613 1727204590.38922: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 42613 1727204590.38925: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204590.38944: getting variables 42613 1727204590.38946: in VariableManager get_vars() 42613 1727204590.38990: Calling all_inventory to load vars for managed-node3 42613 1727204590.38993: Calling groups_inventory to load vars for managed-node3 42613 1727204590.38995: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204590.39007: Calling all_plugins_play to load vars for managed-node3 42613 1727204590.39010: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204590.39013: Calling groups_plugins_play to load vars for managed-node3 42613 1727204590.42783: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204590.48337: done with get_vars() 42613 1727204590.48458: done getting variables 42613 1727204590.48533: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 15:03:10 -0400 (0:00:00.207) 0:00:19.095 ***** 42613 1727204590.48763: entering _queue_task() for managed-node3/fail 42613 1727204590.49451: worker is 1 (out of 1 available) 42613 1727204590.49464: exiting _queue_task() for managed-node3/fail 42613 1727204590.49479: done queuing things up, now waiting for results queue to drain 42613 1727204590.49480: waiting for pending results... 
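The consent task that follows is skipped for the same reason as the earlier update checks: neither `__network_wireless_connections_defined` nor `__network_team_connections_defined` is true for this connection set. As a purely illustrative example (not the role's actual defaults), such a flag could be derived from `network_connections` with a Jinja2 filter chain:

```yaml
# Hypothetical illustration of how a flag like
# __network_team_connections_defined could be computed; the role's real
# defaults may differ.
__network_team_connections_defined: >-
  {{ network_connections | default([])
     | selectattr('type', 'defined')
     | selectattr('type', 'eq', 'team')
     | list | length > 0 }}
```

This matches the log's behavior: with only plain `interface` connections defined, the expression evaluates to False and the dependent tasks are skipped.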
42613 1727204590.50215: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 42613 1727204590.50403: in run() - task 127b8e07-fff9-2f91-05d8-00000000001e 42613 1727204590.50462: variable 'ansible_search_path' from source: unknown 42613 1727204590.50469: variable 'ansible_search_path' from source: unknown 42613 1727204590.50612: calling self._execute() 42613 1727204590.50637: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204590.50641: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204590.50645: variable 'omit' from source: magic vars 42613 1727204590.52086: variable 'ansible_distribution_major_version' from source: facts 42613 1727204590.52091: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204590.52564: variable '__network_wireless_connections_defined' from source: role '' defaults 42613 1727204590.53205: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 42613 1727204590.56712: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 42613 1727204590.57091: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 42613 1727204590.57122: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 42613 1727204590.57153: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 42613 1727204590.57199: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 42613 1727204590.57308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 42613 1727204590.57375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204590.57382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204590.57625: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204590.57742: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204590.57910: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204590.58089: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204590.58093: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204590.58224: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204590.58331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204590.58471: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204590.58651: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204590.58654: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204590.58682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204590.58703: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204590.58947: variable 'network_connections' from source: task vars 42613 1727204590.58973: variable 'interface' from source: set_fact 42613 1727204590.59074: variable 'interface' from source: set_fact 42613 1727204590.59102: variable 'interface' from source: set_fact 42613 1727204590.59182: variable 'interface' from source: set_fact 42613 1727204590.59272: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 42613 1727204590.59449: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 42613 1727204590.59520: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 42613 1727204590.59559: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 42613 1727204590.59608: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 42613 1727204590.59667: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 42613 1727204590.59686: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 42613 1727204590.59705: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204590.59724: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 42613 1727204590.59780: variable '__network_team_connections_defined' from source: role '' defaults 42613 1727204590.60054: variable 'network_connections' from source: task vars 42613 1727204590.60060: variable 'interface' from source: set_fact 42613 1727204590.60142: variable 'interface' from source: set_fact 42613 1727204590.60145: variable 'interface' from source: set_fact 42613 1727204590.60206: variable 'interface' from source: set_fact 42613 1727204590.60252: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 42613 1727204590.60255: when evaluation is False, skipping this task 42613 1727204590.60258: _execute() done 42613 1727204590.60261: dumping result to json 42613 1727204590.60263: done dumping result, returning 42613 1727204590.60274: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's 
consent to restart NetworkManager due to wireless or team interfaces [127b8e07-fff9-2f91-05d8-00000000001e] 42613 1727204590.60284: sending task result for task 127b8e07-fff9-2f91-05d8-00000000001e 42613 1727204590.60405: done sending task result for task 127b8e07-fff9-2f91-05d8-00000000001e 42613 1727204590.60408: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 42613 1727204590.60478: no more pending results, returning what we have 42613 1727204590.60482: results queue empty 42613 1727204590.60483: checking for any_errors_fatal 42613 1727204590.60491: done checking for any_errors_fatal 42613 1727204590.60492: checking for max_fail_percentage 42613 1727204590.60494: done checking for max_fail_percentage 42613 1727204590.60495: checking to see if all hosts have failed and the running result is not ok 42613 1727204590.60496: done checking to see if all hosts have failed 42613 1727204590.60496: getting the remaining hosts for this loop 42613 1727204590.60498: done getting the remaining hosts for this loop 42613 1727204590.60502: getting the next task for host managed-node3 42613 1727204590.60509: done getting next task for host managed-node3 42613 1727204590.60513: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 42613 1727204590.60516: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 42613 1727204590.60533: getting variables 42613 1727204590.60535: in VariableManager get_vars() 42613 1727204590.60576: Calling all_inventory to load vars for managed-node3 42613 1727204590.60579: Calling groups_inventory to load vars for managed-node3 42613 1727204590.60581: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204590.60590: Calling all_plugins_play to load vars for managed-node3 42613 1727204590.60592: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204590.60595: Calling groups_plugins_play to load vars for managed-node3 42613 1727204590.62753: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204590.64229: done with get_vars() 42613 1727204590.64268: done getting variables 42613 1727204590.64318: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 15:03:10 -0400 (0:00:00.155) 0:00:19.251 ***** 42613 1727204590.64355: entering _queue_task() for managed-node3/package 42613 1727204590.65037: worker is 1 (out of 1 available) 42613 1727204590.65049: exiting _queue_task() for managed-node3/package 42613 1727204590.65060: done queuing things up, now waiting for results queue to drain 42613 1727204590.65061: waiting for pending results... 
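The skip just above ("Ask user's consent to restart NetworkManager...") shows the core control flow of an Ansible `when:` conditional: the bare expression is rendered against the task's variables, and a falsy result short-circuits execution into a skip result. A minimal sketch of that flow, not Ansible's actual code (which templates through Jinja2); `evaluate_when` and `run_task` are hypothetical helpers for illustration only:

```python
# Simplified model of how a `when:` conditional short-circuits a task.
# Real Ansible renders the expression with Jinja2; here a restricted eval
# over the task vars is enough to illustrate the skip seen in the log.
def evaluate_when(conditional, variables):
    # Hypothetical helper: evaluate a bare boolean expression against
    # task vars with builtins disabled.
    return bool(eval(conditional, {"__builtins__": {}}, variables))

def run_task(conditional, variables):
    if not evaluate_when(conditional, variables):
        # mirrors: "when evaluation is False, skipping this task"
        return {
            "changed": False,
            "false_condition": conditional,
            "skip_reason": "Conditional result was False",
        }
    return {"changed": True}

# Role defaults in this run: no wireless or team connections defined.
task_vars = {
    "__network_wireless_connections_defined": False,
    "__network_team_connections_defined": False,
}
result = run_task(
    "__network_wireless_connections_defined or __network_team_connections_defined",
    task_vars,
)
print(result["skip_reason"])  # Conditional result was False
```

The dict returned on the skip path matches the `skipping: [managed-node3] => {...}` JSON emitted in the log.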
42613 1727204590.65358: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages 42613 1727204590.65634: in run() - task 127b8e07-fff9-2f91-05d8-00000000001f 42613 1727204590.65639: variable 'ansible_search_path' from source: unknown 42613 1727204590.65642: variable 'ansible_search_path' from source: unknown 42613 1727204590.65644: calling self._execute() 42613 1727204590.65900: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204590.65905: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204590.65908: variable 'omit' from source: magic vars 42613 1727204590.66405: variable 'ansible_distribution_major_version' from source: facts 42613 1727204590.66414: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204590.66705: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 42613 1727204590.67082: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 42613 1727204590.67130: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 42613 1727204590.67163: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 42613 1727204590.67237: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 42613 1727204590.67344: variable 'network_packages' from source: role '' defaults 42613 1727204590.67624: variable '__network_provider_setup' from source: role '' defaults 42613 1727204590.67628: variable '__network_service_name_default_nm' from source: role '' defaults 42613 1727204590.67668: variable '__network_service_name_default_nm' from source: role '' defaults 42613 1727204590.67676: variable '__network_packages_default_nm' from source: role '' defaults 42613 1727204590.67727: variable 
'__network_packages_default_nm' from source: role '' defaults 42613 1727204590.67915: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 42613 1727204590.71236: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 42613 1727204590.71294: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 42613 1727204590.71326: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 42613 1727204590.71355: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 42613 1727204590.71383: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 42613 1727204590.71463: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204590.71486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204590.71505: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204590.71540: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204590.71551: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 
1727204590.71591: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204590.71608: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204590.71631: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204590.71661: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204590.71674: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204590.71837: variable '__network_packages_default_gobject_packages' from source: role '' defaults 42613 1727204590.71927: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204590.71947: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204590.71968: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204590.71999: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204590.72010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204590.72085: variable 'ansible_python' from source: facts 42613 1727204590.72107: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 42613 1727204590.72173: variable '__network_wpa_supplicant_required' from source: role '' defaults 42613 1727204590.72231: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 42613 1727204590.72327: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204590.72347: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204590.72366: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204590.72396: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204590.72407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204590.72448: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204590.72469: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204590.72488: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204590.72516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204590.72529: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204590.72636: variable 'network_connections' from source: task vars 42613 1727204590.72643: variable 'interface' from source: set_fact 42613 1727204590.72717: variable 'interface' from source: set_fact 42613 1727204590.72726: variable 'interface' from source: set_fact 42613 1727204590.72801: variable 'interface' from source: set_fact 42613 1727204590.72881: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 42613 1727204590.72900: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 42613 1727204590.72922: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204590.72950: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 42613 1727204590.72990: variable '__network_wireless_connections_defined' from source: role '' defaults 42613 1727204590.73285: variable 'network_connections' from source: task vars 42613 1727204590.73301: variable 'interface' from source: set_fact 42613 1727204590.73471: variable 'interface' from source: set_fact 42613 1727204590.73475: variable 'interface' from source: set_fact 42613 1727204590.73478: variable 'interface' from source: set_fact 42613 1727204590.73586: variable '__network_packages_default_wireless' from source: role '' defaults 42613 1727204590.73773: variable '__network_wireless_connections_defined' from source: role '' defaults 42613 1727204590.73950: variable 'network_connections' from source: task vars 42613 1727204590.73953: variable 'interface' from source: set_fact 42613 1727204590.74022: variable 'interface' from source: set_fact 42613 1727204590.74025: variable 'interface' from source: set_fact 42613 1727204590.74101: variable 'interface' from source: set_fact 42613 1727204590.74145: variable '__network_packages_default_team' from source: role '' defaults 42613 1727204590.74225: variable '__network_team_connections_defined' from source: role '' defaults 42613 1727204590.74532: variable 'network_connections' from source: task vars 42613 1727204590.74536: variable 'interface' from source: set_fact 42613 1727204590.74601: variable 'interface' from source: set_fact 42613 1727204590.74607: variable 'interface' from source: set_fact 42613 1727204590.74676: variable 'interface' from source: set_fact 42613 1727204590.74772: variable '__network_service_name_default_initscripts' from source: role '' defaults 42613 
1727204590.74831: variable '__network_service_name_default_initscripts' from source: role '' defaults 42613 1727204590.74841: variable '__network_packages_default_initscripts' from source: role '' defaults 42613 1727204590.74902: variable '__network_packages_default_initscripts' from source: role '' defaults 42613 1727204590.75125: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 42613 1727204590.75759: variable 'network_connections' from source: task vars 42613 1727204590.75762: variable 'interface' from source: set_fact 42613 1727204590.75839: variable 'interface' from source: set_fact 42613 1727204590.75857: variable 'interface' from source: set_fact 42613 1727204590.75955: variable 'interface' from source: set_fact 42613 1727204590.76003: variable 'ansible_distribution' from source: facts 42613 1727204590.76013: variable '__network_rh_distros' from source: role '' defaults 42613 1727204590.76024: variable 'ansible_distribution_major_version' from source: facts 42613 1727204590.76062: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 42613 1727204590.76293: variable 'ansible_distribution' from source: facts 42613 1727204590.76297: variable '__network_rh_distros' from source: role '' defaults 42613 1727204590.76300: variable 'ansible_distribution_major_version' from source: facts 42613 1727204590.76309: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 42613 1727204590.76603: variable 'ansible_distribution' from source: facts 42613 1727204590.76607: variable '__network_rh_distros' from source: role '' defaults 42613 1727204590.76609: variable 'ansible_distribution_major_version' from source: facts 42613 1727204590.76611: variable 'network_provider' from source: set_fact 42613 1727204590.76618: variable 'ansible_facts' from source: unknown 42613 1727204590.77835: Evaluated conditional (not network_packages is 
subset(ansible_facts.packages.keys())): False 42613 1727204590.77846: when evaluation is False, skipping this task 42613 1727204590.77854: _execute() done 42613 1727204590.77862: dumping result to json 42613 1727204590.77873: done dumping result, returning 42613 1727204590.77888: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages [127b8e07-fff9-2f91-05d8-00000000001f] 42613 1727204590.77945: sending task result for task 127b8e07-fff9-2f91-05d8-00000000001f skipping: [managed-node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 42613 1727204590.78321: no more pending results, returning what we have 42613 1727204590.78325: results queue empty 42613 1727204590.78325: checking for any_errors_fatal 42613 1727204590.78334: done checking for any_errors_fatal 42613 1727204590.78335: checking for max_fail_percentage 42613 1727204590.78337: done checking for max_fail_percentage 42613 1727204590.78337: checking to see if all hosts have failed and the running result is not ok 42613 1727204590.78338: done checking to see if all hosts have failed 42613 1727204590.78339: getting the remaining hosts for this loop 42613 1727204590.78341: done getting the remaining hosts for this loop 42613 1727204590.78345: getting the next task for host managed-node3 42613 1727204590.78351: done getting next task for host managed-node3 42613 1727204590.78356: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 42613 1727204590.78359: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 42613 1727204590.78491: getting variables 42613 1727204590.78493: in VariableManager get_vars() 42613 1727204590.78530: Calling all_inventory to load vars for managed-node3 42613 1727204590.78540: Calling groups_inventory to load vars for managed-node3 42613 1727204590.78542: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204590.78553: Calling all_plugins_play to load vars for managed-node3 42613 1727204590.78556: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204590.78559: Calling groups_plugins_play to load vars for managed-node3 42613 1727204590.79102: done sending task result for task 127b8e07-fff9-2f91-05d8-00000000001f 42613 1727204590.79106: WORKER PROCESS EXITING 42613 1727204590.80314: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204590.82707: done with get_vars() 42613 1727204590.82744: done getting variables 42613 1727204590.82819: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 15:03:10 -0400 (0:00:00.185) 0:00:19.436 ***** 42613 1727204590.82861: 
entering _queue_task() for managed-node3/package 42613 1727204590.83252: worker is 1 (out of 1 available) 42613 1727204590.83473: exiting _queue_task() for managed-node3/package 42613 1727204590.83485: done queuing things up, now waiting for results queue to drain 42613 1727204590.83487: waiting for pending results... 42613 1727204590.83611: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 42613 1727204590.83775: in run() - task 127b8e07-fff9-2f91-05d8-000000000020 42613 1727204590.83803: variable 'ansible_search_path' from source: unknown 42613 1727204590.83839: variable 'ansible_search_path' from source: unknown 42613 1727204590.83864: calling self._execute() 42613 1727204590.83983: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204590.84020: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204590.84024: variable 'omit' from source: magic vars 42613 1727204590.84471: variable 'ansible_distribution_major_version' from source: facts 42613 1727204590.84494: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204590.84673: variable 'network_state' from source: role '' defaults 42613 1727204590.84677: Evaluated conditional (network_state != {}): False 42613 1727204590.84680: when evaluation is False, skipping this task 42613 1727204590.84683: _execute() done 42613 1727204590.84686: dumping result to json 42613 1727204590.84688: done dumping result, returning 42613 1727204590.84695: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [127b8e07-fff9-2f91-05d8-000000000020] 42613 1727204590.84712: sending task result for task 127b8e07-fff9-2f91-05d8-000000000020 skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": 
"Conditional result was False" } 42613 1727204590.84993: no more pending results, returning what we have 42613 1727204590.85001: results queue empty 42613 1727204590.85002: checking for any_errors_fatal 42613 1727204590.85010: done checking for any_errors_fatal 42613 1727204590.85010: checking for max_fail_percentage 42613 1727204590.85013: done checking for max_fail_percentage 42613 1727204590.85014: checking to see if all hosts have failed and the running result is not ok 42613 1727204590.85015: done checking to see if all hosts have failed 42613 1727204590.85016: getting the remaining hosts for this loop 42613 1727204590.85018: done getting the remaining hosts for this loop 42613 1727204590.85022: getting the next task for host managed-node3 42613 1727204590.85030: done getting next task for host managed-node3 42613 1727204590.85035: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 42613 1727204590.85039: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204590.85057: getting variables 42613 1727204590.85060: in VariableManager get_vars() 42613 1727204590.85271: Calling all_inventory to load vars for managed-node3 42613 1727204590.85275: Calling groups_inventory to load vars for managed-node3 42613 1727204590.85278: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204590.85288: Calling all_plugins_play to load vars for managed-node3 42613 1727204590.85291: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204590.85295: Calling groups_plugins_play to load vars for managed-node3 42613 1727204590.85892: done sending task result for task 127b8e07-fff9-2f91-05d8-000000000020 42613 1727204590.85896: WORKER PROCESS EXITING 42613 1727204590.87071: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204590.89381: done with get_vars() 42613 1727204590.89424: done getting variables 42613 1727204590.89493: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 15:03:10 -0400 (0:00:00.066) 0:00:19.503 ***** 42613 1727204590.89536: entering _queue_task() for managed-node3/package 42613 1727204590.89926: worker is 1 (out of 1 available) 42613 1727204590.90055: exiting _queue_task() for managed-node3/package 42613 1727204590.90070: done queuing things up, now waiting for results queue to drain 42613 1727204590.90072: waiting for pending results... 
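The earlier "Install packages" skip hinged on Ansible's `subset` test (loaded from `plugins/test/mathstuff.py`, as the log shows): `not network_packages is subset(ansible_facts.packages.keys())` came out False because every required package was already present in the gathered package facts. A plain-Python sketch of that check, with illustrative package names and facts shape that are assumptions, not values taken from this run:

```python
# Sketch of Ansible's mathstuff `subset` test: is every element of
# `value` contained in `other`?
def issubset_test(value, other):
    return set(value) <= set(other)

network_packages = ["NetworkManager"]  # assumed role default
# assumed shape of ansible_facts.packages (name -> list of version dicts)
installed = {
    "NetworkManager": [{"version": "1.48"}],
    "openssh": [{"version": "9.6"}],
}

# "not network_packages is subset(ansible_facts.packages.keys())"
needs_install = not issubset_test(network_packages, installed.keys())
print(needs_install)  # False -> the package task is skipped, as in the log
```

Because the conditional is False, the role avoids an unnecessary package-manager transaction on every run.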
42613 1727204590.90279: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 42613 1727204590.90508: in run() - task 127b8e07-fff9-2f91-05d8-000000000021 42613 1727204590.90514: variable 'ansible_search_path' from source: unknown 42613 1727204590.90518: variable 'ansible_search_path' from source: unknown 42613 1727204590.90617: calling self._execute() 42613 1727204590.90687: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204590.90701: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204590.90722: variable 'omit' from source: magic vars 42613 1727204590.91191: variable 'ansible_distribution_major_version' from source: facts 42613 1727204590.91213: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204590.91358: variable 'network_state' from source: role '' defaults 42613 1727204590.91381: Evaluated conditional (network_state != {}): False 42613 1727204590.91415: when evaluation is False, skipping this task 42613 1727204590.91419: _execute() done 42613 1727204590.91422: dumping result to json 42613 1727204590.91424: done dumping result, returning 42613 1727204590.91427: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [127b8e07-fff9-2f91-05d8-000000000021] 42613 1727204590.91435: sending task result for task 127b8e07-fff9-2f91-05d8-000000000021 skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 42613 1727204590.91720: no more pending results, returning what we have 42613 1727204590.91724: results queue empty 42613 1727204590.91725: checking for any_errors_fatal 42613 1727204590.91736: done checking for any_errors_fatal 42613 1727204590.91737: checking for max_fail_percentage 42613 
1727204590.91739: done checking for max_fail_percentage 42613 1727204590.91741: checking to see if all hosts have failed and the running result is not ok 42613 1727204590.91742: done checking to see if all hosts have failed 42613 1727204590.91742: getting the remaining hosts for this loop 42613 1727204590.91744: done getting the remaining hosts for this loop 42613 1727204590.91748: getting the next task for host managed-node3 42613 1727204590.91757: done getting next task for host managed-node3 42613 1727204590.91761: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 42613 1727204590.91764: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204590.91977: getting variables 42613 1727204590.91979: in VariableManager get_vars() 42613 1727204590.92019: Calling all_inventory to load vars for managed-node3 42613 1727204590.92022: Calling groups_inventory to load vars for managed-node3 42613 1727204590.92025: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204590.92034: Calling all_plugins_play to load vars for managed-node3 42613 1727204590.92037: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204590.92040: Calling groups_plugins_play to load vars for managed-node3 42613 1727204590.92585: done sending task result for task 127b8e07-fff9-2f91-05d8-000000000021 42613 1727204590.92589: WORKER PROCESS EXITING 42613 1727204590.94628: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204590.98401: done with get_vars() 42613 1727204590.98440: done getting variables 42613 1727204590.98563: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 15:03:10 -0400 (0:00:00.092) 0:00:19.596 ***** 42613 1727204590.98807: entering _queue_task() for managed-node3/service 42613 1727204590.98809: Creating lock for service 42613 1727204590.99593: worker is 1 (out of 1 available) 42613 1727204590.99606: exiting _queue_task() for managed-node3/service 42613 1727204590.99618: done queuing things up, now waiting for results queue to drain 42613 1727204590.99620: waiting for pending results... 
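Both nmstate-related tasks above ("Install NetworkManager and nmstate..." and "Install python3-libnmstate...") gate on the same conditional, `network_state != {}`. With the role default of an empty dict the conditional is False, so both tasks skip; a non-empty `network_state` (the example structure below is an assumption, not from this run) would let them run:

```python
# The gate both nmstate tasks share: only act when the caller supplied
# a network_state document.
network_state = {}                 # role default seen in this log
print(network_state != {})         # False -> both tasks skip

# A caller-provided state (illustrative shape) flips the conditional.
network_state = {"interfaces": [{"name": "eth0", "state": "up"}]}
print(network_state != {})         # True -> the install tasks would run
```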
42613 1727204591.00186: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 42613 1727204591.00341: in run() - task 127b8e07-fff9-2f91-05d8-000000000022 42613 1727204591.00492: variable 'ansible_search_path' from source: unknown 42613 1727204591.00672: variable 'ansible_search_path' from source: unknown 42613 1727204591.00675: calling self._execute() 42613 1727204591.00679: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204591.00683: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204591.00853: variable 'omit' from source: magic vars 42613 1727204591.01792: variable 'ansible_distribution_major_version' from source: facts 42613 1727204591.01811: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204591.01950: variable '__network_wireless_connections_defined' from source: role '' defaults 42613 1727204591.02591: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 42613 1727204591.07857: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 42613 1727204591.07951: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 42613 1727204591.08372: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 42613 1727204591.08376: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 42613 1727204591.08378: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 42613 1727204591.08381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 42613 1727204591.08503: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204591.08871: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204591.08874: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204591.08877: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204591.08881: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204591.08884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204591.08887: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204591.09030: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204591.09052: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204591.09107: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204591.09137: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204591.09170: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204591.09218: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204591.09238: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204591.09433: variable 'network_connections' from source: task vars 42613 1727204591.09453: variable 'interface' from source: set_fact 42613 1727204591.09542: variable 'interface' from source: set_fact 42613 1727204591.09557: variable 'interface' from source: set_fact 42613 1727204591.09627: variable 'interface' from source: set_fact 42613 1727204591.09739: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 42613 1727204591.09949: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 42613 1727204591.09998: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 42613 1727204591.10038: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 42613 1727204591.10083: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 42613 1727204591.10137: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 42613 1727204591.10168: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 42613 1727204591.10210: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204591.10242: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 42613 1727204591.10315: variable '__network_team_connections_defined' from source: role '' defaults 42613 1727204591.10593: variable 'network_connections' from source: task vars 42613 1727204591.10604: variable 'interface' from source: set_fact 42613 1727204591.10679: variable 'interface' from source: set_fact 42613 1727204591.10690: variable 'interface' from source: set_fact 42613 1727204591.10756: variable 'interface' from source: set_fact 42613 1727204591.10824: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 42613 1727204591.10831: when evaluation is False, skipping this task 42613 1727204591.10838: _execute() done 42613 1727204591.10845: dumping result to json 42613 1727204591.10853: done dumping result, returning 42613 1727204591.10866: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart 
NetworkManager due to wireless or team interfaces [127b8e07-fff9-2f91-05d8-000000000022] 42613 1727204591.10887: sending task result for task 127b8e07-fff9-2f91-05d8-000000000022 skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 42613 1727204591.11047: no more pending results, returning what we have 42613 1727204591.11051: results queue empty 42613 1727204591.11052: checking for any_errors_fatal 42613 1727204591.11060: done checking for any_errors_fatal 42613 1727204591.11060: checking for max_fail_percentage 42613 1727204591.11063: done checking for max_fail_percentage 42613 1727204591.11064: checking to see if all hosts have failed and the running result is not ok 42613 1727204591.11065: done checking to see if all hosts have failed 42613 1727204591.11067: getting the remaining hosts for this loop 42613 1727204591.11069: done getting the remaining hosts for this loop 42613 1727204591.11074: getting the next task for host managed-node3 42613 1727204591.11082: done getting next task for host managed-node3 42613 1727204591.11087: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 42613 1727204591.11090: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204591.11107: getting variables 42613 1727204591.11109: in VariableManager get_vars() 42613 1727204591.11151: Calling all_inventory to load vars for managed-node3 42613 1727204591.11154: Calling groups_inventory to load vars for managed-node3 42613 1727204591.11156: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204591.11171: Calling all_plugins_play to load vars for managed-node3 42613 1727204591.11175: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204591.11179: Calling groups_plugins_play to load vars for managed-node3 42613 1727204591.12177: done sending task result for task 127b8e07-fff9-2f91-05d8-000000000022 42613 1727204591.12182: WORKER PROCESS EXITING 42613 1727204591.14779: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204591.18424: done with get_vars() 42613 1727204591.18451: done getting variables 42613 1727204591.18521: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 15:03:11 -0400 (0:00:00.197) 0:00:19.793 ***** 42613 1727204591.18557: entering _queue_task() for managed-node3/service 42613 1727204591.18929: worker is 1 (out of 1 available) 42613 1727204591.18945: exiting _queue_task() for managed-node3/service 42613 1727204591.18959: done queuing things up, now waiting for results queue to drain 42613 1727204591.18960: waiting for pending results... 
42613 1727204591.19277: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 42613 1727204591.19454: in run() - task 127b8e07-fff9-2f91-05d8-000000000023 42613 1727204591.19481: variable 'ansible_search_path' from source: unknown 42613 1727204591.19490: variable 'ansible_search_path' from source: unknown 42613 1727204591.19541: calling self._execute() 42613 1727204591.19656: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204591.19900: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204591.19904: variable 'omit' from source: magic vars 42613 1727204591.20637: variable 'ansible_distribution_major_version' from source: facts 42613 1727204591.20717: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204591.21017: variable 'network_provider' from source: set_fact 42613 1727204591.21190: variable 'network_state' from source: role '' defaults 42613 1727204591.21206: Evaluated conditional (network_provider == "nm" or network_state != {}): True 42613 1727204591.21251: variable 'omit' from source: magic vars 42613 1727204591.21398: variable 'omit' from source: magic vars 42613 1727204591.21448: variable 'network_service_name' from source: role '' defaults 42613 1727204591.21547: variable 'network_service_name' from source: role '' defaults 42613 1727204591.21679: variable '__network_provider_setup' from source: role '' defaults 42613 1727204591.21690: variable '__network_service_name_default_nm' from source: role '' defaults 42613 1727204591.21759: variable '__network_service_name_default_nm' from source: role '' defaults 42613 1727204591.21867: variable '__network_packages_default_nm' from source: role '' defaults 42613 1727204591.21873: variable '__network_packages_default_nm' from source: role '' defaults 42613 1727204591.22119: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 42613 1727204591.25613: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 42613 1727204591.25717: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 42613 1727204591.25771: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 42613 1727204591.25820: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 42613 1727204591.25856: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 42613 1727204591.26012: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204591.26060: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204591.26096: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204591.26154: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204591.26177: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204591.26268: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 42613 1727204591.26299: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204591.26363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204591.26385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204591.26403: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204591.26759: variable '__network_packages_default_gobject_packages' from source: role '' defaults 42613 1727204591.26911: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204591.26946: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204591.27020: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204591.27030: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204591.27051: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204591.27167: variable 'ansible_python' from source: facts 42613 1727204591.27197: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 42613 1727204591.27302: variable '__network_wpa_supplicant_required' from source: role '' defaults 42613 1727204591.27401: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 42613 1727204591.27563: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204591.27672: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204591.27676: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204591.27679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204591.27696: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204591.27755: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204591.27802: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204591.27830: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204591.27882: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204591.27970: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204591.28079: variable 'network_connections' from source: task vars 42613 1727204591.28093: variable 'interface' from source: set_fact 42613 1727204591.28368: variable 'interface' from source: set_fact 42613 1727204591.28423: variable 'interface' from source: set_fact 42613 1727204591.28524: variable 'interface' from source: set_fact 42613 1727204591.29174: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 42613 1727204591.29638: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 42613 1727204591.29779: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 42613 1727204591.29842: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 42613 1727204591.30017: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 42613 1727204591.30184: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 42613 1727204591.30393: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 42613 1727204591.30419: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204591.30493: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 42613 1727204591.30774: variable '__network_wireless_connections_defined' from source: role '' defaults 42613 1727204591.31607: variable 'network_connections' from source: task vars 42613 1727204591.31630: variable 'interface' from source: set_fact 42613 1727204591.32023: variable 'interface' from source: set_fact 42613 1727204591.32026: variable 'interface' from source: set_fact 42613 1727204591.32376: variable 'interface' from source: set_fact 42613 1727204591.32638: variable '__network_packages_default_wireless' from source: role '' defaults 42613 1727204591.32869: variable '__network_wireless_connections_defined' from source: role '' defaults 42613 1727204591.33670: variable 'network_connections' from source: task vars 42613 1727204591.33747: variable 'interface' from source: set_fact 42613 1727204591.33877: variable 'interface' from source: set_fact 42613 1727204591.33889: variable 'interface' from source: set_fact 42613 1727204591.33984: variable 'interface' from source: set_fact 42613 1727204591.34092: variable '__network_packages_default_team' from source: role '' defaults 42613 1727204591.34297: variable '__network_team_connections_defined' from source: role '' defaults 42613 1727204591.34903: variable 
'network_connections' from source: task vars 42613 1727204591.34908: variable 'interface' from source: set_fact 42613 1727204591.34998: variable 'interface' from source: set_fact 42613 1727204591.35007: variable 'interface' from source: set_fact 42613 1727204591.35089: variable 'interface' from source: set_fact 42613 1727204591.35195: variable '__network_service_name_default_initscripts' from source: role '' defaults 42613 1727204591.35259: variable '__network_service_name_default_initscripts' from source: role '' defaults 42613 1727204591.35270: variable '__network_packages_default_initscripts' from source: role '' defaults 42613 1727204591.35343: variable '__network_packages_default_initscripts' from source: role '' defaults 42613 1727204591.35592: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 42613 1727204591.36276: variable 'network_connections' from source: task vars 42613 1727204591.36280: variable 'interface' from source: set_fact 42613 1727204591.36291: variable 'interface' from source: set_fact 42613 1727204591.36300: variable 'interface' from source: set_fact 42613 1727204591.36362: variable 'interface' from source: set_fact 42613 1727204591.36472: variable 'ansible_distribution' from source: facts 42613 1727204591.36475: variable '__network_rh_distros' from source: role '' defaults 42613 1727204591.36479: variable 'ansible_distribution_major_version' from source: facts 42613 1727204591.36482: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 42613 1727204591.37081: variable 'ansible_distribution' from source: facts 42613 1727204591.37089: variable '__network_rh_distros' from source: role '' defaults 42613 1727204591.37092: variable 'ansible_distribution_major_version' from source: facts 42613 1727204591.37109: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 42613 1727204591.37580: variable 'ansible_distribution' from source: 
facts 42613 1727204591.37595: variable '__network_rh_distros' from source: role '' defaults 42613 1727204591.37603: variable 'ansible_distribution_major_version' from source: facts 42613 1727204591.37644: variable 'network_provider' from source: set_fact 42613 1727204591.37703: variable 'omit' from source: magic vars 42613 1727204591.37773: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 42613 1727204591.37777: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 42613 1727204591.37805: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 42613 1727204591.37990: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204591.37994: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204591.38030: variable 'inventory_hostname' from source: host vars for 'managed-node3' 42613 1727204591.38036: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204591.38039: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204591.38389: Set connection var ansible_shell_executable to /bin/sh 42613 1727204591.38401: Set connection var ansible_pipelining to False 42613 1727204591.38404: Set connection var ansible_module_compression to ZIP_DEFLATED 42613 1727204591.38407: Set connection var ansible_connection to ssh 42613 1727204591.38413: Set connection var ansible_timeout to 10 42613 1727204591.38415: Set connection var ansible_shell_type to sh 42613 1727204591.38448: variable 'ansible_shell_executable' from source: unknown 42613 1727204591.38451: variable 'ansible_connection' from source: unknown 42613 1727204591.38455: variable 'ansible_module_compression' from source: unknown 42613 1727204591.38457: 
variable 'ansible_shell_type' from source: unknown 42613 1727204591.38620: variable 'ansible_shell_executable' from source: unknown 42613 1727204591.38623: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204591.38631: variable 'ansible_pipelining' from source: unknown 42613 1727204591.38635: variable 'ansible_timeout' from source: unknown 42613 1727204591.38638: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204591.38807: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 42613 1727204591.38840: variable 'omit' from source: magic vars 42613 1727204591.38843: starting attempt loop 42613 1727204591.38846: running the handler 42613 1727204591.38972: variable 'ansible_facts' from source: unknown 42613 1727204591.39826: _low_level_execute_command(): starting 42613 1727204591.39830: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 42613 1727204591.40353: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204591.40391: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found <<< 42613 1727204591.40396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204591.40399: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found <<< 42613 1727204591.40402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204591.40447: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204591.40450: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204591.40461: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204591.40546: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204591.42374: stdout chunk (state=3): >>>/root <<< 42613 1727204591.42477: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204591.42550: stderr chunk (state=3): >>><<< 42613 1727204591.42552: stdout chunk (state=3): >>><<< 42613 1727204591.42572: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204591.42581: _low_level_execute_command(): starting 42613 1727204591.42587: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204591.4256957-44369-50187209476884 `" && echo ansible-tmp-1727204591.4256957-44369-50187209476884="` echo /root/.ansible/tmp/ansible-tmp-1727204591.4256957-44369-50187209476884 `" ) && sleep 0' 42613 1727204591.43072: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204591.43089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204591.43093: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204591.43138: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 
42613 1727204591.43152: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204591.43236: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204591.45379: stdout chunk (state=3): >>>ansible-tmp-1727204591.4256957-44369-50187209476884=/root/.ansible/tmp/ansible-tmp-1727204591.4256957-44369-50187209476884 <<< 42613 1727204591.45627: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204591.45632: stdout chunk (state=3): >>><<< 42613 1727204591.45634: stderr chunk (state=3): >>><<< 42613 1727204591.45654: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204591.4256957-44369-50187209476884=/root/.ansible/tmp/ansible-tmp-1727204591.4256957-44369-50187209476884 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status 
from master 0 42613 1727204591.45796: variable 'ansible_module_compression' from source: unknown 42613 1727204591.45806: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 42613 1727204591.45815: ANSIBALLZ: Acquiring lock 42613 1727204591.45817: ANSIBALLZ: Lock acquired: 139982757271872 42613 1727204591.45820: ANSIBALLZ: Creating module 42613 1727204591.68038: ANSIBALLZ: Writing module into payload 42613 1727204591.68162: ANSIBALLZ: Writing module 42613 1727204591.68194: ANSIBALLZ: Renaming module 42613 1727204591.68199: ANSIBALLZ: Done creating module 42613 1727204591.68232: variable 'ansible_facts' from source: unknown 42613 1727204591.68381: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204591.4256957-44369-50187209476884/AnsiballZ_systemd.py 42613 1727204591.68510: Sending initial data 42613 1727204591.68514: Sent initial data (155 bytes) 42613 1727204591.69029: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204591.69034: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204591.69037: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204591.69092: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204591.69095: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204591.69098: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204591.69180: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204591.70962: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 42613 1727204591.71028: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 42613 1727204591.71099: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-42613igbktgy6/tmpcmy2kkog /root/.ansible/tmp/ansible-tmp-1727204591.4256957-44369-50187209476884/AnsiballZ_systemd.py <<< 42613 1727204591.71103: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204591.4256957-44369-50187209476884/AnsiballZ_systemd.py" <<< 42613 1727204591.71163: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-42613igbktgy6/tmpcmy2kkog" to remote "/root/.ansible/tmp/ansible-tmp-1727204591.4256957-44369-50187209476884/AnsiballZ_systemd.py" <<< 42613 1727204591.71168: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204591.4256957-44369-50187209476884/AnsiballZ_systemd.py" <<< 42613 1727204591.72413: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204591.72492: stderr chunk (state=3): >>><<< 42613 1727204591.72496: stdout chunk (state=3): >>><<< 42613 1727204591.72520: done transferring module to remote 42613 1727204591.72531: _low_level_execute_command(): starting 42613 1727204591.72539: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204591.4256957-44369-50187209476884/ /root/.ansible/tmp/ansible-tmp-1727204591.4256957-44369-50187209476884/AnsiballZ_systemd.py && sleep 0' 42613 1727204591.73059: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204591.73063: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204591.73069: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204591.73119: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204591.73125: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204591.73130: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204591.73198: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204591.75232: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204591.75289: stderr chunk (state=3): >>><<< 42613 1727204591.75293: stdout chunk (state=3): >>><<< 42613 1727204591.75307: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204591.75310: _low_level_execute_command(): starting 42613 1727204591.75318: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204591.4256957-44369-50187209476884/AnsiballZ_systemd.py && sleep 0' 42613 1727204591.75822: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204591.75826: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204591.75837: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204591.75839: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204591.75885: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' 
debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204591.75888: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204591.75980: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204592.09502: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "670", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:48:15 EDT", "ExecMainStartTimestampMonotonic": "32994154", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "670", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3507", "MemoryCurrent": "11833344", "MemoryPeak": "13684736", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3515854848", "CPUUsageNSec": "3182746000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": 
"auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCO<<< 42613 1727204592.09516: stdout chunk (state=3): >>>RE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": 
"no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice 
sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target NetworkManager-wait-online.service cloud-init.service network.service multi-user.target network.target", "After": "dbus-broker.service cloud-init-local.service systemd-journald.socket system.slice dbus.socket sysinit.target network-pre.target basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": <<< 42613 1727204592.09538: stdout chunk (state=3): >>>"system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 15:00:56 EDT", "StateChangeTimestampMonotonic": "794185509", "InactiveExitTimestamp": "Tue 2024-09-24 14:48:15 EDT", "InactiveExitTimestampMonotonic": "32994691", "ActiveEnterTimestamp": "Tue 2024-09-24 14:48:17 EDT", "ActiveEnterTimestampMonotonic": "34735054", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:48:15 EDT", "ConditionTimestampMonotonic": "32982961", "AssertTimestamp": "Tue 
2024-09-24 14:48:15 EDT", "AssertTimestampMonotonic": "32982965", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "4c25d2827e7b45838bcc13e108827a7f", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 42613 1727204592.11776: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. <<< 42613 1727204592.11837: stderr chunk (state=3): >>><<< 42613 1727204592.11841: stdout chunk (state=3): >>><<< 42613 1727204592.11861: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "670", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:48:15 EDT", "ExecMainStartTimestampMonotonic": 
"32994154", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "670", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3507", "MemoryCurrent": "11833344", "MemoryPeak": "13684736", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3515854848", "CPUUsageNSec": "3182746000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", 
"MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", 
"StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target NetworkManager-wait-online.service cloud-init.service network.service multi-user.target network.target", "After": "dbus-broker.service cloud-init-local.service systemd-journald.socket system.slice dbus.socket sysinit.target network-pre.target basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 15:00:56 EDT", "StateChangeTimestampMonotonic": "794185509", "InactiveExitTimestamp": "Tue 2024-09-24 14:48:15 EDT", "InactiveExitTimestampMonotonic": "32994691", "ActiveEnterTimestamp": "Tue 2024-09-24 14:48:17 EDT", "ActiveEnterTimestampMonotonic": "34735054", 
"ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:48:15 EDT", "ConditionTimestampMonotonic": "32982961", "AssertTimestamp": "Tue 2024-09-24 14:48:15 EDT", "AssertTimestampMonotonic": "32982965", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "4c25d2827e7b45838bcc13e108827a7f", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 42613 1727204592.12000: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204591.4256957-44369-50187209476884/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 42613 1727204592.12024: _low_level_execute_command(): starting 42613 1727204592.12027: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204591.4256957-44369-50187209476884/ > /dev/null 2>&1 && sleep 0' 42613 1727204592.12527: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204592.12531: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 
10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204592.12537: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204592.12539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204592.12600: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204592.12607: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204592.12610: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204592.12678: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204592.14699: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204592.14764: stderr chunk (state=3): >>><<< 42613 1727204592.14770: stdout chunk (state=3): >>><<< 42613 1727204592.14781: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204592.14789: handler run complete 42613 1727204592.14841: attempt loop complete, returning result 42613 1727204592.14845: _execute() done 42613 1727204592.14847: dumping result to json 42613 1727204592.14858: done dumping result, returning 42613 1727204592.14869: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [127b8e07-fff9-2f91-05d8-000000000023] 42613 1727204592.14874: sending task result for task 127b8e07-fff9-2f91-05d8-000000000023 42613 1727204592.15146: done sending task result for task 127b8e07-fff9-2f91-05d8-000000000023 42613 1727204592.15149: WORKER PROCESS EXITING ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 42613 1727204592.15209: no more pending results, returning what we have 42613 1727204592.15213: results queue empty 42613 1727204592.15214: checking for any_errors_fatal 42613 1727204592.15224: done checking for any_errors_fatal 42613 1727204592.15225: checking for max_fail_percentage 42613 1727204592.15226: done checking for max_fail_percentage 42613 1727204592.15227: checking to see if all hosts have failed and the running result is not ok 42613 1727204592.15228: done checking to see if all hosts have failed 42613 1727204592.15229: 
getting the remaining hosts for this loop 42613 1727204592.15230: done getting the remaining hosts for this loop 42613 1727204592.15237: getting the next task for host managed-node3 42613 1727204592.15243: done getting next task for host managed-node3 42613 1727204592.15247: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 42613 1727204592.15250: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 42613 1727204592.15268: getting variables 42613 1727204592.15271: in VariableManager get_vars() 42613 1727204592.15305: Calling all_inventory to load vars for managed-node3 42613 1727204592.15308: Calling groups_inventory to load vars for managed-node3 42613 1727204592.15310: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204592.15320: Calling all_plugins_play to load vars for managed-node3 42613 1727204592.15323: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204592.15326: Calling groups_plugins_play to load vars for managed-node3 42613 1727204592.16355: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204592.17643: done with get_vars() 42613 1727204592.17674: done getting variables 42613 1727204592.17726: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 15:03:12 -0400 (0:00:00.991) 0:00:20.785 ***** 42613 1727204592.17757: entering _queue_task() for managed-node3/service 42613 1727204592.18045: worker is 1 (out of 1 available) 42613 1727204592.18062: exiting _queue_task() for managed-node3/service 42613 1727204592.18076: done queuing things up, now waiting for results queue to drain 42613 1727204592.18078: waiting for pending results... 42613 1727204592.18274: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 42613 1727204592.18384: in run() - task 127b8e07-fff9-2f91-05d8-000000000024 42613 1727204592.18396: variable 'ansible_search_path' from source: unknown 42613 1727204592.18400: variable 'ansible_search_path' from source: unknown 42613 1727204592.18439: calling self._execute() 42613 1727204592.18514: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204592.18520: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204592.18531: variable 'omit' from source: magic vars 42613 1727204592.18838: variable 'ansible_distribution_major_version' from source: facts 42613 1727204592.18847: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204592.18940: variable 'network_provider' from source: set_fact 42613 1727204592.18944: Evaluated conditional (network_provider == "nm"): True 42613 1727204592.19015: variable '__network_wpa_supplicant_required' from source: role '' defaults 42613 1727204592.19086: variable '__network_ieee802_1x_connections_defined' from source: role '' 
defaults 42613 1727204592.19217: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 42613 1727204592.21796: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 42613 1727204592.21853: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 42613 1727204592.21884: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 42613 1727204592.21911: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 42613 1727204592.21937: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 42613 1727204592.22009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204592.22034: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204592.22058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204592.22090: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204592.22101: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204592.22141: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204592.22168: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204592.22187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204592.22215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204592.22226: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204592.22264: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204592.22283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204592.22303: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204592.22329: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 
1727204592.22341: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204592.22516: variable 'network_connections' from source: task vars 42613 1727204592.22518: variable 'interface' from source: set_fact 42613 1727204592.22592: variable 'interface' from source: set_fact 42613 1727204592.22608: variable 'interface' from source: set_fact 42613 1727204592.22677: variable 'interface' from source: set_fact 42613 1727204592.22792: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 42613 1727204592.23007: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 42613 1727204592.23075: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 42613 1727204592.23133: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 42613 1727204592.23170: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 42613 1727204592.23274: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 42613 1727204592.23277: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 42613 1727204592.23280: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204592.23298: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 42613 1727204592.23354: variable '__network_wireless_connections_defined' from source: role '' defaults 42613 1727204592.23974: variable 'network_connections' from source: task vars 42613 1727204592.23977: variable 'interface' from source: set_fact 42613 1727204592.23980: variable 'interface' from source: set_fact 42613 1727204592.23982: variable 'interface' from source: set_fact 42613 1727204592.24172: variable 'interface' from source: set_fact 42613 1727204592.24379: Evaluated conditional (__network_wpa_supplicant_required): False 42613 1727204592.24383: when evaluation is False, skipping this task 42613 1727204592.24385: _execute() done 42613 1727204592.24397: dumping result to json 42613 1727204592.24399: done dumping result, returning 42613 1727204592.24402: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [127b8e07-fff9-2f91-05d8-000000000024] 42613 1727204592.24404: sending task result for task 127b8e07-fff9-2f91-05d8-000000000024 42613 1727204592.24486: done sending task result for task 127b8e07-fff9-2f91-05d8-000000000024 42613 1727204592.24489: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 42613 1727204592.24542: no more pending results, returning what we have 42613 1727204592.24547: results queue empty 42613 1727204592.24548: checking for any_errors_fatal 42613 1727204592.24662: done checking for any_errors_fatal 42613 1727204592.24664: checking for max_fail_percentage 42613 1727204592.24669: done checking for max_fail_percentage 42613 1727204592.24670: checking to see if all hosts have failed and the running result is not ok 42613 1727204592.24671: done checking to see if all hosts have failed 42613 1727204592.24672: getting the remaining hosts for 
this loop 42613 1727204592.24674: done getting the remaining hosts for this loop 42613 1727204592.24679: getting the next task for host managed-node3 42613 1727204592.24689: done getting next task for host managed-node3 42613 1727204592.24694: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 42613 1727204592.24703: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 42613 1727204592.24721: getting variables 42613 1727204592.24723: in VariableManager get_vars() 42613 1727204592.24885: Calling all_inventory to load vars for managed-node3 42613 1727204592.24888: Calling groups_inventory to load vars for managed-node3 42613 1727204592.24890: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204592.24900: Calling all_plugins_play to load vars for managed-node3 42613 1727204592.24903: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204592.24906: Calling groups_plugins_play to load vars for managed-node3 42613 1727204592.26800: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204592.29110: done with get_vars() 42613 1727204592.29148: done getting variables 42613 1727204592.29217: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 15:03:12 -0400 (0:00:00.114) 0:00:20.900 ***** 42613 1727204592.29253: entering _queue_task() for managed-node3/service 42613 1727204592.29627: worker is 1 (out of 1 available) 42613 1727204592.29641: exiting _queue_task() for managed-node3/service 42613 1727204592.29654: done queuing things up, now waiting for results queue to drain 42613 1727204592.29655: waiting for pending results... 42613 1727204592.29973: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service 42613 1727204592.30138: in run() - task 127b8e07-fff9-2f91-05d8-000000000025 42613 1727204592.30161: variable 'ansible_search_path' from source: unknown 42613 1727204592.30271: variable 'ansible_search_path' from source: unknown 42613 1727204592.30277: calling self._execute() 42613 1727204592.30331: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204592.30343: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204592.30358: variable 'omit' from source: magic vars 42613 1727204592.30774: variable 'ansible_distribution_major_version' from source: facts 42613 1727204592.30794: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204592.30937: variable 'network_provider' from source: set_fact 42613 1727204592.30950: Evaluated conditional (network_provider == "initscripts"): False 42613 1727204592.30958: when evaluation is False, skipping this task 42613 1727204592.30967: _execute() done 42613 1727204592.30975: dumping result to json 42613 1727204592.30982: done dumping result, 
returning 42613 1727204592.30993: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service [127b8e07-fff9-2f91-05d8-000000000025] 42613 1727204592.31045: sending task result for task 127b8e07-fff9-2f91-05d8-000000000025 42613 1727204592.31123: done sending task result for task 127b8e07-fff9-2f91-05d8-000000000025 42613 1727204592.31126: WORKER PROCESS EXITING skipping: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 42613 1727204592.31199: no more pending results, returning what we have 42613 1727204592.31203: results queue empty 42613 1727204592.31205: checking for any_errors_fatal 42613 1727204592.31216: done checking for any_errors_fatal 42613 1727204592.31217: checking for max_fail_percentage 42613 1727204592.31220: done checking for max_fail_percentage 42613 1727204592.31221: checking to see if all hosts have failed and the running result is not ok 42613 1727204592.31222: done checking to see if all hosts have failed 42613 1727204592.31222: getting the remaining hosts for this loop 42613 1727204592.31224: done getting the remaining hosts for this loop 42613 1727204592.31228: getting the next task for host managed-node3 42613 1727204592.31236: done getting next task for host managed-node3 42613 1727204592.31243: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 42613 1727204592.31247: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 42613 1727204592.31469: getting variables 42613 1727204592.31471: in VariableManager get_vars() 42613 1727204592.31510: Calling all_inventory to load vars for managed-node3 42613 1727204592.31514: Calling groups_inventory to load vars for managed-node3 42613 1727204592.31516: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204592.31527: Calling all_plugins_play to load vars for managed-node3 42613 1727204592.31530: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204592.31533: Calling groups_plugins_play to load vars for managed-node3 42613 1727204592.33541: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204592.35880: done with get_vars() 42613 1727204592.35921: done getting variables 42613 1727204592.35990: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 15:03:12 -0400 (0:00:00.067) 0:00:20.968 ***** 42613 1727204592.36029: entering _queue_task() for managed-node3/copy 42613 1727204592.36825: worker is 1 (out of 1 available) 42613 1727204592.36843: exiting _queue_task() for managed-node3/copy 42613 1727204592.36856: done queuing things up, now waiting for results queue to drain 42613 1727204592.36858: waiting for pending results... 
42613 1727204592.37631: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 42613 1727204592.37876: in run() - task 127b8e07-fff9-2f91-05d8-000000000026 42613 1727204592.37881: variable 'ansible_search_path' from source: unknown 42613 1727204592.37884: variable 'ansible_search_path' from source: unknown 42613 1727204592.38054: calling self._execute() 42613 1727204592.38328: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204592.38332: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204592.38337: variable 'omit' from source: magic vars 42613 1727204592.39477: variable 'ansible_distribution_major_version' from source: facts 42613 1727204592.39482: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204592.39894: variable 'network_provider' from source: set_fact 42613 1727204592.39899: Evaluated conditional (network_provider == "initscripts"): False 42613 1727204592.39902: when evaluation is False, skipping this task 42613 1727204592.39990: _execute() done 42613 1727204592.39994: dumping result to json 42613 1727204592.39997: done dumping result, returning 42613 1727204592.40001: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [127b8e07-fff9-2f91-05d8-000000000026] 42613 1727204592.40004: sending task result for task 127b8e07-fff9-2f91-05d8-000000000026 skipping: [managed-node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 42613 1727204592.40154: no more pending results, returning what we have 42613 1727204592.40159: results queue empty 42613 1727204592.40160: checking for any_errors_fatal 42613 1727204592.40169: done checking for any_errors_fatal 42613 1727204592.40170: checking for max_fail_percentage 42613 
1727204592.40174: done checking for max_fail_percentage 42613 1727204592.40175: checking to see if all hosts have failed and the running result is not ok 42613 1727204592.40176: done checking to see if all hosts have failed 42613 1727204592.40176: getting the remaining hosts for this loop 42613 1727204592.40178: done getting the remaining hosts for this loop 42613 1727204592.40184: getting the next task for host managed-node3 42613 1727204592.40192: done getting next task for host managed-node3 42613 1727204592.40197: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 42613 1727204592.40200: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204592.40221: getting variables 42613 1727204592.40223: in VariableManager get_vars() 42613 1727204592.40881: done sending task result for task 127b8e07-fff9-2f91-05d8-000000000026 42613 1727204592.40886: WORKER PROCESS EXITING 42613 1727204592.40888: Calling all_inventory to load vars for managed-node3 42613 1727204592.40893: Calling groups_inventory to load vars for managed-node3 42613 1727204592.40896: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204592.40909: Calling all_plugins_play to load vars for managed-node3 42613 1727204592.40913: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204592.40916: Calling groups_plugins_play to load vars for managed-node3 42613 1727204592.43025: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204592.48464: done with get_vars() 42613 1727204592.48713: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 15:03:12 -0400 (0:00:00.127) 0:00:21.096 ***** 42613 1727204592.48819: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 42613 1727204592.48821: Creating lock for fedora.linux_system_roles.network_connections 42613 1727204592.49649: worker is 1 (out of 1 available) 42613 1727204592.50069: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 42613 1727204592.50083: done queuing things up, now waiting for results queue to drain 42613 1727204592.50084: waiting for pending results... 
42613 1727204592.50490: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 42613 1727204592.50641: in run() - task 127b8e07-fff9-2f91-05d8-000000000027 42613 1727204592.50665: variable 'ansible_search_path' from source: unknown 42613 1727204592.51275: variable 'ansible_search_path' from source: unknown 42613 1727204592.51280: calling self._execute() 42613 1727204592.51284: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204592.51287: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204592.51290: variable 'omit' from source: magic vars 42613 1727204592.52474: variable 'ansible_distribution_major_version' from source: facts 42613 1727204592.52479: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204592.52482: variable 'omit' from source: magic vars 42613 1727204592.52484: variable 'omit' from source: magic vars 42613 1727204592.52651: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 42613 1727204592.56775: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 42613 1727204592.57057: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 42613 1727204592.57109: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 42613 1727204592.57153: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 42613 1727204592.57302: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 42613 1727204592.57400: variable 'network_provider' from source: set_fact 42613 1727204592.57627: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204592.57806: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204592.57902: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204592.57955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204592.58161: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204592.58262: variable 'omit' from source: magic vars 42613 1727204592.58403: variable 'omit' from source: magic vars 42613 1727204592.58526: variable 'network_connections' from source: task vars 42613 1727204592.58552: variable 'interface' from source: set_fact 42613 1727204592.58627: variable 'interface' from source: set_fact 42613 1727204592.58649: variable 'interface' from source: set_fact 42613 1727204592.58717: variable 'interface' from source: set_fact 42613 1727204592.59152: variable 'omit' from source: magic vars 42613 1727204592.59168: variable '__lsr_ansible_managed' from source: task vars 42613 1727204592.59242: variable '__lsr_ansible_managed' from source: task vars 42613 1727204592.59563: Loaded config def from plugin (lookup/template) 42613 1727204592.59576: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 42613 1727204592.59619: File lookup term: get_ansible_managed.j2 42613 
1727204592.59629: variable 'ansible_search_path' from source: unknown 42613 1727204592.59641: evaluation_path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 42613 1727204592.59662: search_path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 42613 1727204592.59690: variable 'ansible_search_path' from source: unknown 42613 1727204592.67769: variable 'ansible_managed' from source: unknown 42613 1727204592.67963: variable 'omit' from source: magic vars 42613 1727204592.68005: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 42613 1727204592.68047: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 42613 1727204592.68072: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 42613 1727204592.68100: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204592.68116: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204592.68153: variable 'inventory_hostname' from source: host vars for 'managed-node3' 42613 1727204592.68207: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204592.68210: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204592.68293: Set connection var ansible_shell_executable to /bin/sh 42613 1727204592.68303: Set connection var ansible_pipelining to False 42613 1727204592.68320: Set connection var ansible_module_compression to ZIP_DEFLATED 42613 1727204592.68326: Set connection var ansible_connection to ssh 42613 1727204592.68338: Set connection var ansible_timeout to 10 42613 1727204592.68345: Set connection var ansible_shell_type to sh 42613 1727204592.68377: variable 'ansible_shell_executable' from source: unknown 42613 1727204592.68423: variable 'ansible_connection' from source: unknown 42613 1727204592.68427: variable 'ansible_module_compression' from source: unknown 42613 1727204592.68429: variable 'ansible_shell_type' from source: unknown 42613 1727204592.68432: variable 'ansible_shell_executable' from source: unknown 42613 1727204592.68437: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204592.68439: variable 'ansible_pipelining' from source: unknown 42613 1727204592.68442: variable 'ansible_timeout' from source: unknown 42613 1727204592.68444: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204592.68589: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 42613 1727204592.68643: variable 'omit' from source: magic vars 42613 1727204592.68646: starting attempt loop 42613 1727204592.68649: running the handler 42613 1727204592.68660: _low_level_execute_command(): starting 42613 1727204592.68674: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 42613 1727204592.69569: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found <<< 42613 1727204592.69590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204592.69793: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204592.69972: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204592.71758: stdout chunk (state=3): >>>/root <<< 42613 1727204592.72219: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204592.72223: stdout chunk 
(state=3): >>><<< 42613 1727204592.72226: stderr chunk (state=3): >>><<< 42613 1727204592.72229: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204592.72233: _low_level_execute_command(): starting 42613 1727204592.72240: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204592.7211893-44410-215301686553090 `" && echo ansible-tmp-1727204592.7211893-44410-215301686553090="` echo /root/.ansible/tmp/ansible-tmp-1727204592.7211893-44410-215301686553090 `" ) && sleep 0' 42613 1727204592.73597: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204592.73601: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204592.73604: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204592.73606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204592.73881: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204592.73999: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204592.74070: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204592.76230: stdout chunk (state=3): >>>ansible-tmp-1727204592.7211893-44410-215301686553090=/root/.ansible/tmp/ansible-tmp-1727204592.7211893-44410-215301686553090 <<< 42613 1727204592.76384: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204592.76508: stderr chunk (state=3): >>><<< 42613 1727204592.76511: stdout chunk (state=3): >>><<< 42613 1727204592.76540: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204592.7211893-44410-215301686553090=/root/.ansible/tmp/ansible-tmp-1727204592.7211893-44410-215301686553090 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204592.76654: variable 'ansible_module_compression' from source: unknown 42613 1727204592.76712: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 42613 1727204592.76716: ANSIBALLZ: Acquiring lock 42613 1727204592.76719: ANSIBALLZ: Lock acquired: 139982752444112 42613 1727204592.76721: ANSIBALLZ: Creating module 42613 1727204593.14326: ANSIBALLZ: Writing module into payload 42613 1727204593.15123: ANSIBALLZ: Writing module 42613 1727204593.15269: ANSIBALLZ: Renaming module 42613 1727204593.15273: ANSIBALLZ: Done creating module 42613 1727204593.15307: variable 'ansible_facts' from source: unknown 42613 1727204593.15645: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204592.7211893-44410-215301686553090/AnsiballZ_network_connections.py 42613 1727204593.16171: Sending initial data 42613 1727204593.16175: Sent initial data (168 bytes) 42613 1727204593.17390: stderr chunk 
(state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204593.17673: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204593.17888: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204593.18033: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204593.19836: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: 
Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 42613 1727204593.19968: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 42613 1727204593.20037: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-42613igbktgy6/tmpzmveb19t /root/.ansible/tmp/ansible-tmp-1727204592.7211893-44410-215301686553090/AnsiballZ_network_connections.py <<< 42613 1727204593.20052: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204592.7211893-44410-215301686553090/AnsiballZ_network_connections.py" <<< 42613 1727204593.20174: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-42613igbktgy6/tmpzmveb19t" to remote "/root/.ansible/tmp/ansible-tmp-1727204592.7211893-44410-215301686553090/AnsiballZ_network_connections.py" <<< 42613 1727204593.20284: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204592.7211893-44410-215301686553090/AnsiballZ_network_connections.py" <<< 42613 1727204593.22375: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204593.22379: stderr chunk (state=3): >>><<< 42613 1727204593.22382: stdout chunk (state=3): >>><<< 42613 1727204593.22384: done transferring module to remote 42613 1727204593.22386: _low_level_execute_command(): starting 42613 1727204593.22388: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204592.7211893-44410-215301686553090/ /root/.ansible/tmp/ansible-tmp-1727204592.7211893-44410-215301686553090/AnsiballZ_network_connections.py && sleep 0' 42613 1727204593.23889: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204593.24032: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204593.24052: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204593.24221: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204593.26194: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204593.26370: stderr chunk (state=3): >>><<< 42613 1727204593.26381: stdout chunk (state=3): >>><<< 42613 1727204593.26404: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204593.26414: _low_level_execute_command(): starting 42613 1727204593.26425: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204592.7211893-44410-215301686553090/AnsiballZ_network_connections.py && sleep 0' 42613 1727204593.27107: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204593.27125: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204593.27183: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204593.27255: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204593.27275: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204593.27296: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204593.27427: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204593.78461: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 4b766b42-9d6e-4768-952e-daf596460f29\n[004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 4b766b42-9d6e-4768-952e-daf596460f29 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"dhcp4": false, "address": ["198.51.100.3/26", "2001:db8::2/32"], "route": [{"network": "198.51.100.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4, "table": 30200}, {"network": "198.51.100.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2, "table": 30400}, {"network": "2001:db8::4", "prefix": 32, "gateway": "2001:db8::1", "metric": 2, "table": 30600}], "routing_rule": [{"priority": 30200, "from": "198.51.100.58/26", "table": 30200}, {"priority": 30201, "family": "ipv4", "fwmark": 1, "fwmask": 1, "table": 30200}, {"priority": 30202, "family": "ipv4", "ipproto": 6, "table": 30200}, {"priority": 30203, "family": "ipv4", "sport": "128 - 256", "table": 30200}, {"priority": 30204, "family": "ipv4", "tos": 8, "table": 30200}, {"priority": 30400, "to": "198.51.100.128/26", "table": 30400}, {"priority": 30401, "family": "ipv4", "iif": "iiftest", "table": 30400}, {"priority": 30402, "family": "ipv4", "oif": "oiftest", "table": 30400}, {"priority": 30403, "from": "0.0.0.0/0", "to": "0.0.0.0/0", "table": 30400}, {"priority": 30600, "to": "2001:db8::4/32", "table": 
30600}, {"priority": 30601, "family": "ipv6", "dport": "128 - 256", "invert": true, "table": 30600}, {"priority": 30602, "from": "::/0", "to": "::/0", "table": 30600}, {"priority": 200, "from": "198.51.100.56/26", "table": "custom"}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"dhcp4": false, "address": ["198.51.100.3/26", "2001:db8::2/32"], "route": [{"network": "198.51.100.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4, "table": 30200}, {"network": "198.51.100.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2, "table": 30400}, {"network": "2001:db8::4", "prefix": 32, "gateway": "2001:db8::1", "metric": 2, "table": 30600}], "routing_rule": [{"priority": 30200, "from": "198.51.100.58/26", "table": 30200}, {"priority": 30201, "family": "ipv4", "fwmark": 1, "fwmask": 1, "table": 30200}, {"priority": 30202, "family": "ipv4", "ipproto": 6, "table": 30200}, {"priority": 30203, "family": "ipv4", "sport": "128 - 256", "table": 30200}, {"priority": 30204, "family": "ipv4", "tos": 8, "table": 30200}, {"priority": 30400, "to": "198.51.100.128/26", "table": 30400}, {"priority": 30401, "family": "ipv4", "iif": "iiftest", "table": 30400}, {"priority": 30402, "family": "ipv4", "oif": "oiftest", "table": 30400}, {"priority": 30403, "from": "0.0.0.0/0", "to": "0.0.0.0/0", "table": 30400}, {"priority": 30600, "to": "2001:db8::4/32", "table": 30600}, {"priority": 30601, "family": "ipv6", "dport": "128 - 256", "invert": true, "table": 30600}, {"priority": 30602, "from": "::/0", "to": "::/0", "table": 30600}, {"priority": 200, "from": "198.51.100.56/26", "table": "custom"}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": 
false, "__debug_flags": ""}}} <<< 42613 1727204593.80890: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. <<< 42613 1727204593.80955: stderr chunk (state=3): >>><<< 42613 1727204593.80959: stdout chunk (state=3): >>><<< 42613 1727204593.80979: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 4b766b42-9d6e-4768-952e-daf596460f29\n[004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 4b766b42-9d6e-4768-952e-daf596460f29 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"dhcp4": false, "address": ["198.51.100.3/26", "2001:db8::2/32"], "route": [{"network": "198.51.100.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4, "table": 30200}, {"network": "198.51.100.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2, "table": 30400}, {"network": "2001:db8::4", "prefix": 32, "gateway": "2001:db8::1", "metric": 2, "table": 30600}], "routing_rule": [{"priority": 30200, "from": "198.51.100.58/26", "table": 30200}, {"priority": 30201, "family": "ipv4", "fwmark": 1, "fwmask": 1, "table": 30200}, {"priority": 30202, "family": "ipv4", "ipproto": 6, "table": 30200}, {"priority": 30203, "family": "ipv4", "sport": "128 - 256", "table": 30200}, {"priority": 30204, "family": "ipv4", "tos": 8, "table": 30200}, {"priority": 30400, "to": "198.51.100.128/26", "table": 30400}, {"priority": 30401, "family": "ipv4", "iif": "iiftest", "table": 30400}, {"priority": 30402, "family": "ipv4", "oif": "oiftest", "table": 30400}, {"priority": 30403, "from": "0.0.0.0/0", "to": "0.0.0.0/0", "table": 30400}, {"priority": 30600, "to": "2001:db8::4/32", "table": 30600}, {"priority": 30601, "family": "ipv6", "dport": "128 - 256", 
"invert": true, "table": 30600}, {"priority": 30602, "from": "::/0", "to": "::/0", "table": 30600}, {"priority": 200, "from": "198.51.100.56/26", "table": "custom"}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"dhcp4": false, "address": ["198.51.100.3/26", "2001:db8::2/32"], "route": [{"network": "198.51.100.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4, "table": 30200}, {"network": "198.51.100.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2, "table": 30400}, {"network": "2001:db8::4", "prefix": 32, "gateway": "2001:db8::1", "metric": 2, "table": 30600}], "routing_rule": [{"priority": 30200, "from": "198.51.100.58/26", "table": 30200}, {"priority": 30201, "family": "ipv4", "fwmark": 1, "fwmask": 1, "table": 30200}, {"priority": 30202, "family": "ipv4", "ipproto": 6, "table": 30200}, {"priority": 30203, "family": "ipv4", "sport": "128 - 256", "table": 30200}, {"priority": 30204, "family": "ipv4", "tos": 8, "table": 30200}, {"priority": 30400, "to": "198.51.100.128/26", "table": 30400}, {"priority": 30401, "family": "ipv4", "iif": "iiftest", "table": 30400}, {"priority": 30402, "family": "ipv4", "oif": "oiftest", "table": 30400}, {"priority": 30403, "from": "0.0.0.0/0", "to": "0.0.0.0/0", "table": 30400}, {"priority": 30600, "to": "2001:db8::4/32", "table": 30600}, {"priority": 30601, "family": "ipv6", "dport": "128 - 256", "invert": true, "table": 30600}, {"priority": 30602, "from": "::/0", "to": "::/0", "table": 30600}, {"priority": 200, "from": "198.51.100.56/26", "table": "custom"}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 
2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
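The `module_args` echoed in the result above amount to a single entry in the role's `network_connections` list. The sketch below rebuilds that structure in Python and checks one invariant visible in the log: every static route targets a policy-routing table that is also matched by a routing rule. Values are copied from the logged invocation; the `routing_rule` list is abridged, and this is only an illustration of the data shape, not code from the role itself:

```python
# Reconstruction of the connection profile passed to
# fedora.linux_system_roles.network_connections above. Values are copied from
# the logged module_args; the routing_rule list is abridged (the log lists 13
# rules in total). Illustrative only.
connection = {
    "name": "ethtest0",
    "interface_name": "ethtest0",
    "state": "up",
    "type": "ethernet",
    "autoconnect": True,
    "ip": {
        "dhcp4": False,
        "address": ["198.51.100.3/26", "2001:db8::2/32"],
        "route": [
            {"network": "198.51.100.64", "prefix": 26,
             "gateway": "198.51.100.6", "metric": 4, "table": 30200},
            {"network": "198.51.100.128", "prefix": 26,
             "gateway": "198.51.100.1", "metric": 2, "table": 30400},
            {"network": "2001:db8::4", "prefix": 32,
             "gateway": "2001:db8::1", "metric": 2, "table": 30600},
        ],
        "routing_rule": [  # abridged subset of the 13 rules in the log
            {"priority": 30200, "from": "198.51.100.58/26", "table": 30200},
            {"priority": 30400, "to": "198.51.100.128/26", "table": 30400},
            {"priority": 30600, "to": "2001:db8::4/32", "table": 30600},
        ],
    },
}

# Every route steers traffic into a policy-routing table, and each of those
# tables is selected by at least one rule.
route_tables = sorted({r["table"] for r in connection["ip"]["route"]})
rule_tables = {r["table"] for r in connection["ip"]["routing_rule"]}
```

Mirroring tables between `route` and `routing_rule` is what makes the source/mark/port-based rules in the log actually reach the routes installed for tables 30200, 30400, and 30600.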
42613 1727204593.81085: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'ethtest0', 'interface_name': 'ethtest0', 'state': 'up', 'type': 'ethernet', 'autoconnect': True, 'ip': {'dhcp4': False, 'address': ['198.51.100.3/26', '2001:db8::2/32'], 'route': [{'network': '198.51.100.64', 'prefix': 26, 'gateway': '198.51.100.6', 'metric': 4, 'table': 30200}, {'network': '198.51.100.128', 'prefix': 26, 'gateway': '198.51.100.1', 'metric': 2, 'table': 30400}, {'network': '2001:db8::4', 'prefix': 32, 'gateway': '2001:db8::1', 'metric': 2, 'table': 30600}], 'routing_rule': [{'priority': 30200, 'from': '198.51.100.58/26', 'table': 30200}, {'priority': 30201, 'family': 'ipv4', 'fwmark': 1, 'fwmask': 1, 'table': 30200}, {'priority': 30202, 'family': 'ipv4', 'ipproto': 6, 'table': 30200}, {'priority': 30203, 'family': 'ipv4', 'sport': '128 - 256', 'table': 30200}, {'priority': 30204, 'family': 'ipv4', 'tos': 8, 'table': 30200}, {'priority': 30400, 'to': '198.51.100.128/26', 'table': 30400}, {'priority': 30401, 'family': 'ipv4', 'iif': 'iiftest', 'table': 30400}, {'priority': 30402, 'family': 'ipv4', 'oif': 'oiftest', 'table': 30400}, {'priority': 30403, 'from': '0.0.0.0/0', 'to': '0.0.0.0/0', 'table': 30400}, {'priority': 30600, 'to': '2001:db8::4/32', 'table': 30600}, {'priority': 30601, 'family': 'ipv6', 'dport': '128 - 256', 'invert': True, 'table': 30600}, {'priority': 30602, 'from': '::/0', 'to': '::/0', 'table': 30600}, {'priority': 200, 'from': '198.51.100.56/26', 'table': 'custom'}]}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], 
'_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204592.7211893-44410-215301686553090/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 42613 1727204593.81093: _low_level_execute_command(): starting 42613 1727204593.81098: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204592.7211893-44410-215301686553090/ > /dev/null 2>&1 && sleep 0' 42613 1727204593.81610: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204593.81615: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address <<< 42613 1727204593.81624: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204593.81679: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204593.81685: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204593.81699: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master 
version 4 <<< 42613 1727204593.81760: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204593.83830: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204593.83896: stderr chunk (state=3): >>><<< 42613 1727204593.83900: stdout chunk (state=3): >>><<< 42613 1727204593.83914: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204593.83921: handler run complete 42613 1727204593.84013: attempt loop complete, returning result 42613 1727204593.84017: _execute() done 42613 1727204593.84020: dumping result to json 42613 1727204593.84031: done dumping result, returning 42613 1727204593.84042: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 
[127b8e07-fff9-2f91-05d8-000000000027] 42613 1727204593.84047: sending task result for task 127b8e07-fff9-2f91-05d8-000000000027 42613 1727204593.84211: done sending task result for task 127b8e07-fff9-2f91-05d8-000000000027 42613 1727204593.84214: WORKER PROCESS EXITING changed: [managed-node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": true, "interface_name": "ethtest0", "ip": { "address": [ "198.51.100.3/26", "2001:db8::2/32" ], "dhcp4": false, "route": [ { "gateway": "198.51.100.6", "metric": 4, "network": "198.51.100.64", "prefix": 26, "table": 30200 }, { "gateway": "198.51.100.1", "metric": 2, "network": "198.51.100.128", "prefix": 26, "table": 30400 }, { "gateway": "2001:db8::1", "metric": 2, "network": "2001:db8::4", "prefix": 32, "table": 30600 } ], "routing_rule": [ { "from": "198.51.100.58/26", "priority": 30200, "table": 30200 }, { "family": "ipv4", "fwmark": 1, "fwmask": 1, "priority": 30201, "table": 30200 }, { "family": "ipv4", "ipproto": 6, "priority": 30202, "table": 30200 }, { "family": "ipv4", "priority": 30203, "sport": "128 - 256", "table": 30200 }, { "family": "ipv4", "priority": 30204, "table": 30200, "tos": 8 }, { "priority": 30400, "table": 30400, "to": "198.51.100.128/26" }, { "family": "ipv4", "iif": "iiftest", "priority": 30401, "table": 30400 }, { "family": "ipv4", "oif": "oiftest", "priority": 30402, "table": 30400 }, { "from": "0.0.0.0/0", "priority": 30403, "table": 30400, "to": "0.0.0.0/0" }, { "priority": 30600, "table": 30600, "to": "2001:db8::4/32" }, { "dport": "128 - 256", "family": "ipv6", "invert": true, "priority": 30601, "table": 30600 }, { "from": "::/0", "priority": 30602, "table": 30600, "to": "::/0" }, { "from": "198.51.100.56/26", "priority": 200, "table": "custom" } ] }, "name": "ethtest0", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, 
"changed": true } STDERR: [003] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 4b766b42-9d6e-4768-952e-daf596460f29 [004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 4b766b42-9d6e-4768-952e-daf596460f29 (not-active) 42613 1727204593.84512: no more pending results, returning what we have 42613 1727204593.84515: results queue empty 42613 1727204593.84516: checking for any_errors_fatal 42613 1727204593.84523: done checking for any_errors_fatal 42613 1727204593.84524: checking for max_fail_percentage 42613 1727204593.84526: done checking for max_fail_percentage 42613 1727204593.84526: checking to see if all hosts have failed and the running result is not ok 42613 1727204593.84527: done checking to see if all hosts have failed 42613 1727204593.84528: getting the remaining hosts for this loop 42613 1727204593.84529: done getting the remaining hosts for this loop 42613 1727204593.84533: getting the next task for host managed-node3 42613 1727204593.84539: done getting next task for host managed-node3 42613 1727204593.84542: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 42613 1727204593.84545: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204593.84555: getting variables 42613 1727204593.84557: in VariableManager get_vars() 42613 1727204593.84601: Calling all_inventory to load vars for managed-node3 42613 1727204593.84604: Calling groups_inventory to load vars for managed-node3 42613 1727204593.84606: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204593.84616: Calling all_plugins_play to load vars for managed-node3 42613 1727204593.84618: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204593.84621: Calling groups_plugins_play to load vars for managed-node3 42613 1727204593.85731: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204593.86925: done with get_vars() 42613 1727204593.86953: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 15:03:13 -0400 (0:00:01.382) 0:00:22.478 ***** 42613 1727204593.87029: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_state 42613 1727204593.87030: Creating lock for fedora.linux_system_roles.network_state 42613 1727204593.87323: worker is 1 (out of 1 available) 42613 1727204593.87340: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_state 42613 1727204593.87353: done queuing things up, now waiting for results queue to drain 42613 1727204593.87354: waiting for pending results... 
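The `changed: [managed-node3]` result above records the full `module_args` passed to `fedora.linux_system_roles.network_connections`. A minimal playbook sketch reconstructed from those logged arguments (hypothetical and abbreviated; this is not the actual test playbook that produced this log) would look like:

```yaml
# Sketch only: reconstructed from the module_args logged above, trimmed
# to the first route and routing rule for brevity.
- hosts: managed-node3
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_connections:
          - name: ethtest0
            interface_name: ethtest0
            type: ethernet
            state: up
            autoconnect: true
            ip:
              dhcp4: false
              address:
                - 198.51.100.3/26
                - 2001:db8::2/32
              route:
                - network: 198.51.100.64
                  prefix: 26
                  gateway: 198.51.100.6
                  metric: 4
                  table: 30200
                # ...remaining routes as in the result above
              routing_rule:
                - priority: 30200
                  from: 198.51.100.58/26
                  table: 30200
                # ...remaining rules as in the result above
```

With `provider: nm`, the role translates this declaration into the two NetworkManager operations reported in STDERR: adding the `ethtest0` profile and bringing it up.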
42613 1727204593.87550: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state 42613 1727204593.87670: in run() - task 127b8e07-fff9-2f91-05d8-000000000028 42613 1727204593.87682: variable 'ansible_search_path' from source: unknown 42613 1727204593.87687: variable 'ansible_search_path' from source: unknown 42613 1727204593.87722: calling self._execute() 42613 1727204593.87807: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204593.87813: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204593.87818: variable 'omit' from source: magic vars 42613 1727204593.88136: variable 'ansible_distribution_major_version' from source: facts 42613 1727204593.88147: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204593.88244: variable 'network_state' from source: role '' defaults 42613 1727204593.88251: Evaluated conditional (network_state != {}): False 42613 1727204593.88254: when evaluation is False, skipping this task 42613 1727204593.88259: _execute() done 42613 1727204593.88262: dumping result to json 42613 1727204593.88264: done dumping result, returning 42613 1727204593.88274: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state [127b8e07-fff9-2f91-05d8-000000000028] 42613 1727204593.88279: sending task result for task 127b8e07-fff9-2f91-05d8-000000000028 42613 1727204593.88385: done sending task result for task 127b8e07-fff9-2f91-05d8-000000000028 42613 1727204593.88388: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 42613 1727204593.88444: no more pending results, returning what we have 42613 1727204593.88449: results queue empty 42613 1727204593.88450: checking for any_errors_fatal 42613 1727204593.88477: done checking for any_errors_fatal 
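The "Configure networking state" task is skipped here because the role default for `network_state` is an empty dict, so the conditional `network_state != {}` evaluates False. A hypothetical example of the variable that would make this task run (the role's `network_state` accepts a declarative, nmstate-style description) would be:

```yaml
# Hypothetical: not set in this run, where network_state stayed at its
# default {} and the task was skipped.
network_state:
  interfaces:
    - name: ethtest0
      type: ethernet
      state: up
```

In this test run all configuration went through `network_connections` instead, so the state-based path is never exercised.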
42613 1727204593.88478: checking for max_fail_percentage 42613 1727204593.88482: done checking for max_fail_percentage 42613 1727204593.88482: checking to see if all hosts have failed and the running result is not ok 42613 1727204593.88483: done checking to see if all hosts have failed 42613 1727204593.88484: getting the remaining hosts for this loop 42613 1727204593.88485: done getting the remaining hosts for this loop 42613 1727204593.88490: getting the next task for host managed-node3 42613 1727204593.88498: done getting next task for host managed-node3 42613 1727204593.88502: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 42613 1727204593.88505: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204593.88521: getting variables 42613 1727204593.88523: in VariableManager get_vars() 42613 1727204593.88558: Calling all_inventory to load vars for managed-node3 42613 1727204593.88561: Calling groups_inventory to load vars for managed-node3 42613 1727204593.88563: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204593.88581: Calling all_plugins_play to load vars for managed-node3 42613 1727204593.88584: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204593.88587: Calling groups_plugins_play to load vars for managed-node3 42613 1727204593.89677: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204593.90884: done with get_vars() 42613 1727204593.90913: done getting variables 42613 1727204593.90970: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 15:03:13 -0400 (0:00:00.039) 0:00:22.518 ***** 42613 1727204593.90998: entering _queue_task() for managed-node3/debug 42613 1727204593.91281: worker is 1 (out of 1 available) 42613 1727204593.91297: exiting _queue_task() for managed-node3/debug 42613 1727204593.91311: done queuing things up, now waiting for results queue to drain 42613 1727204593.91312: waiting for pending results... 
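After the connection profile is up, the routes and policy-routing rules applied above can be spot-checked on the managed node with standard iproute2 and NetworkManager commands. A sketch (to be run as root on managed-node3; table numbers taken from the result logged above):

```shell
# Verify the per-table routes installed for ethtest0
ip route show table 30200
ip route show table 30400
ip -6 route show table 30600

# Verify the IPv4 and IPv6 policy-routing rules (priorities 30200-30602)
ip rule show
ip -6 rule show

# Inspect the NetworkManager profile the role created
nmcli connection show ethtest0
```

These commands only read state; they are a verification aid, not part of the role's own execution.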
42613 1727204593.91514: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 42613 1727204593.91621: in run() - task 127b8e07-fff9-2f91-05d8-000000000029 42613 1727204593.91633: variable 'ansible_search_path' from source: unknown 42613 1727204593.91637: variable 'ansible_search_path' from source: unknown 42613 1727204593.91677: calling self._execute() 42613 1727204593.91760: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204593.91767: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204593.91779: variable 'omit' from source: magic vars 42613 1727204593.92090: variable 'ansible_distribution_major_version' from source: facts 42613 1727204593.92101: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204593.92108: variable 'omit' from source: magic vars 42613 1727204593.92161: variable 'omit' from source: magic vars 42613 1727204593.92192: variable 'omit' from source: magic vars 42613 1727204593.92232: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 42613 1727204593.92264: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 42613 1727204593.92283: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 42613 1727204593.92299: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204593.92310: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204593.92337: variable 'inventory_hostname' from source: host vars for 'managed-node3' 42613 1727204593.92342: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204593.92345: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node3' 42613 1727204593.92424: Set connection var ansible_shell_executable to /bin/sh 42613 1727204593.92439: Set connection var ansible_pipelining to False 42613 1727204593.92445: Set connection var ansible_module_compression to ZIP_DEFLATED 42613 1727204593.92448: Set connection var ansible_connection to ssh 42613 1727204593.92453: Set connection var ansible_timeout to 10 42613 1727204593.92456: Set connection var ansible_shell_type to sh 42613 1727204593.92477: variable 'ansible_shell_executable' from source: unknown 42613 1727204593.92480: variable 'ansible_connection' from source: unknown 42613 1727204593.92483: variable 'ansible_module_compression' from source: unknown 42613 1727204593.92486: variable 'ansible_shell_type' from source: unknown 42613 1727204593.92488: variable 'ansible_shell_executable' from source: unknown 42613 1727204593.92490: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204593.92496: variable 'ansible_pipelining' from source: unknown 42613 1727204593.92499: variable 'ansible_timeout' from source: unknown 42613 1727204593.92502: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204593.92626: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 42613 1727204593.92635: variable 'omit' from source: magic vars 42613 1727204593.92642: starting attempt loop 42613 1727204593.92645: running the handler 42613 1727204593.92757: variable '__network_connections_result' from source: set_fact 42613 1727204593.92818: handler run complete 42613 1727204593.92834: attempt loop complete, returning result 42613 1727204593.92841: _execute() done 42613 1727204593.92844: dumping result to json 42613 1727204593.92847: 
done dumping result, returning 42613 1727204593.92853: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [127b8e07-fff9-2f91-05d8-000000000029] 42613 1727204593.92860: sending task result for task 127b8e07-fff9-2f91-05d8-000000000029 42613 1727204593.92960: done sending task result for task 127b8e07-fff9-2f91-05d8-000000000029 42613 1727204593.92963: WORKER PROCESS EXITING ok: [managed-node3] => { "__network_connections_result.stderr_lines": [ "[003] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 4b766b42-9d6e-4768-952e-daf596460f29", "[004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 4b766b42-9d6e-4768-952e-daf596460f29 (not-active)" ] } 42613 1727204593.93029: no more pending results, returning what we have 42613 1727204593.93033: results queue empty 42613 1727204593.93034: checking for any_errors_fatal 42613 1727204593.93043: done checking for any_errors_fatal 42613 1727204593.93044: checking for max_fail_percentage 42613 1727204593.93046: done checking for max_fail_percentage 42613 1727204593.93047: checking to see if all hosts have failed and the running result is not ok 42613 1727204593.93048: done checking to see if all hosts have failed 42613 1727204593.93048: getting the remaining hosts for this loop 42613 1727204593.93050: done getting the remaining hosts for this loop 42613 1727204593.93054: getting the next task for host managed-node3 42613 1727204593.93061: done getting next task for host managed-node3 42613 1727204593.93073: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 42613 1727204593.93076: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 42613 1727204593.93089: getting variables 42613 1727204593.93091: in VariableManager get_vars() 42613 1727204593.93128: Calling all_inventory to load vars for managed-node3 42613 1727204593.93131: Calling groups_inventory to load vars for managed-node3 42613 1727204593.93133: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204593.93145: Calling all_plugins_play to load vars for managed-node3 42613 1727204593.93147: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204593.93150: Calling groups_plugins_play to load vars for managed-node3 42613 1727204593.94180: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204593.95426: done with get_vars() 42613 1727204593.95458: done getting variables 42613 1727204593.95512: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 15:03:13 -0400 (0:00:00.045) 0:00:22.563 ***** 42613 1727204593.95542: entering _queue_task() for managed-node3/debug 42613 1727204593.95841: worker is 1 (out of 1 available) 42613 1727204593.95858: exiting _queue_task() for 
managed-node3/debug 42613 1727204593.95872: done queuing things up, now waiting for results queue to drain 42613 1727204593.95873: waiting for pending results... 42613 1727204593.96071: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 42613 1727204593.96373: in run() - task 127b8e07-fff9-2f91-05d8-00000000002a 42613 1727204593.96376: variable 'ansible_search_path' from source: unknown 42613 1727204593.96380: variable 'ansible_search_path' from source: unknown 42613 1727204593.96383: calling self._execute() 42613 1727204593.96415: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204593.96430: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204593.96448: variable 'omit' from source: magic vars 42613 1727204593.96913: variable 'ansible_distribution_major_version' from source: facts 42613 1727204593.96944: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204593.96957: variable 'omit' from source: magic vars 42613 1727204593.97028: variable 'omit' from source: magic vars 42613 1727204593.97155: variable 'omit' from source: magic vars 42613 1727204593.97161: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 42613 1727204593.97201: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 42613 1727204593.97230: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 42613 1727204593.97263: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204593.97293: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204593.97318: variable 'inventory_hostname' from source: host vars for 
'managed-node3' 42613 1727204593.97322: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204593.97324: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204593.97430: Set connection var ansible_shell_executable to /bin/sh 42613 1727204593.97434: Set connection var ansible_pipelining to False 42613 1727204593.97445: Set connection var ansible_module_compression to ZIP_DEFLATED 42613 1727204593.97447: Set connection var ansible_connection to ssh 42613 1727204593.97452: Set connection var ansible_timeout to 10 42613 1727204593.97455: Set connection var ansible_shell_type to sh 42613 1727204593.97478: variable 'ansible_shell_executable' from source: unknown 42613 1727204593.97481: variable 'ansible_connection' from source: unknown 42613 1727204593.97488: variable 'ansible_module_compression' from source: unknown 42613 1727204593.97491: variable 'ansible_shell_type' from source: unknown 42613 1727204593.97494: variable 'ansible_shell_executable' from source: unknown 42613 1727204593.97497: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204593.97499: variable 'ansible_pipelining' from source: unknown 42613 1727204593.97501: variable 'ansible_timeout' from source: unknown 42613 1727204593.97505: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204593.97637: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 42613 1727204593.97649: variable 'omit' from source: magic vars 42613 1727204593.97654: starting attempt loop 42613 1727204593.97658: running the handler 42613 1727204593.97703: variable '__network_connections_result' from source: set_fact 42613 1727204593.97773: variable 
'__network_connections_result' from source: set_fact 42613 1727204593.98310: handler run complete 42613 1727204593.98367: attempt loop complete, returning result 42613 1727204593.98371: _execute() done 42613 1727204593.98374: dumping result to json 42613 1727204593.98381: done dumping result, returning 42613 1727204593.98390: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [127b8e07-fff9-2f91-05d8-00000000002a] 42613 1727204593.98395: sending task result for task 127b8e07-fff9-2f91-05d8-00000000002a 42613 1727204593.98524: done sending task result for task 127b8e07-fff9-2f91-05d8-00000000002a 42613 1727204593.98527: WORKER PROCESS EXITING ok: [managed-node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": true, "interface_name": "ethtest0", "ip": { "address": [ "198.51.100.3/26", "2001:db8::2/32" ], "dhcp4": false, "route": [ { "gateway": "198.51.100.6", "metric": 4, "network": "198.51.100.64", "prefix": 26, "table": 30200 }, { "gateway": "198.51.100.1", "metric": 2, "network": "198.51.100.128", "prefix": 26, "table": 30400 }, { "gateway": "2001:db8::1", "metric": 2, "network": "2001:db8::4", "prefix": 32, "table": 30600 } ], "routing_rule": [ { "from": "198.51.100.58/26", "priority": 30200, "table": 30200 }, { "family": "ipv4", "fwmark": 1, "fwmask": 1, "priority": 30201, "table": 30200 }, { "family": "ipv4", "ipproto": 6, "priority": 30202, "table": 30200 }, { "family": "ipv4", "priority": 30203, "sport": "128 - 256", "table": 30200 }, { "family": "ipv4", "priority": 30204, "table": 30200, "tos": 8 }, { "priority": 30400, "table": 30400, "to": "198.51.100.128/26" }, { "family": "ipv4", "iif": "iiftest", "priority": 30401, "table": 30400 }, { "family": "ipv4", "oif": "oiftest", "priority": 30402, "table": 30400 }, { "from": "0.0.0.0/0", 
"priority": 30403, "table": 30400, "to": "0.0.0.0/0" }, { "priority": 30600, "table": 30600, "to": "2001:db8::4/32" }, { "dport": "128 - 256", "family": "ipv6", "invert": true, "priority": 30601, "table": 30600 }, { "from": "::/0", "priority": 30602, "table": 30600, "to": "::/0" }, { "from": "198.51.100.56/26", "priority": 200, "table": "custom" } ] }, "name": "ethtest0", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[003] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 4b766b42-9d6e-4768-952e-daf596460f29\n[004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 4b766b42-9d6e-4768-952e-daf596460f29 (not-active)\n", "stderr_lines": [ "[003] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 4b766b42-9d6e-4768-952e-daf596460f29", "[004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 4b766b42-9d6e-4768-952e-daf596460f29 (not-active)" ] } } 42613 1727204593.98727: no more pending results, returning what we have 42613 1727204593.98731: results queue empty 42613 1727204593.98732: checking for any_errors_fatal 42613 1727204593.98737: done checking for any_errors_fatal 42613 1727204593.98737: checking for max_fail_percentage 42613 1727204593.98739: done checking for max_fail_percentage 42613 1727204593.98740: checking to see if all hosts have failed and the running result is not ok 42613 1727204593.98741: done checking to see if all hosts have failed 42613 1727204593.98741: getting the remaining hosts for this loop 42613 1727204593.98743: done getting the remaining hosts for this loop 42613 1727204593.98753: getting the next task for host managed-node3 42613 1727204593.98759: done getting next task for host managed-node3 42613 1727204593.98763: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 42613 
1727204593.98768: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 42613 1727204593.98780: getting variables 42613 1727204593.98781: in VariableManager get_vars() 42613 1727204593.98815: Calling all_inventory to load vars for managed-node3 42613 1727204593.98817: Calling groups_inventory to load vars for managed-node3 42613 1727204593.98820: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204593.98827: Calling all_plugins_play to load vars for managed-node3 42613 1727204593.98829: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204593.98831: Calling groups_plugins_play to load vars for managed-node3 42613 1727204594.00387: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204594.02556: done with get_vars() 42613 1727204594.02593: done getting variables 42613 1727204594.02659: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 15:03:14 -0400 (0:00:00.071) 
0:00:22.635 ***** 42613 1727204594.02698: entering _queue_task() for managed-node3/debug 42613 1727204594.03083: worker is 1 (out of 1 available) 42613 1727204594.03098: exiting _queue_task() for managed-node3/debug 42613 1727204594.03111: done queuing things up, now waiting for results queue to drain 42613 1727204594.03112: waiting for pending results... 42613 1727204594.03591: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 42613 1727204594.03598: in run() - task 127b8e07-fff9-2f91-05d8-00000000002b 42613 1727204594.03611: variable 'ansible_search_path' from source: unknown 42613 1727204594.03620: variable 'ansible_search_path' from source: unknown 42613 1727204594.03670: calling self._execute() 42613 1727204594.03783: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204594.03903: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204594.03907: variable 'omit' from source: magic vars 42613 1727204594.04253: variable 'ansible_distribution_major_version' from source: facts 42613 1727204594.04275: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204594.04415: variable 'network_state' from source: role '' defaults 42613 1727204594.04435: Evaluated conditional (network_state != {}): False 42613 1727204594.04447: when evaluation is False, skipping this task 42613 1727204594.04456: _execute() done 42613 1727204594.04464: dumping result to json 42613 1727204594.04475: done dumping result, returning 42613 1727204594.04489: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [127b8e07-fff9-2f91-05d8-00000000002b] 42613 1727204594.04499: sending task result for task 127b8e07-fff9-2f91-05d8-00000000002b skipping: [managed-node3] => { "false_condition": "network_state != {}" } 42613 1727204594.04668: no more pending results, 
returning what we have 42613 1727204594.04672: results queue empty 42613 1727204594.04674: checking for any_errors_fatal 42613 1727204594.04691: done checking for any_errors_fatal 42613 1727204594.04692: checking for max_fail_percentage 42613 1727204594.04695: done checking for max_fail_percentage 42613 1727204594.04696: checking to see if all hosts have failed and the running result is not ok 42613 1727204594.04697: done checking to see if all hosts have failed 42613 1727204594.04697: getting the remaining hosts for this loop 42613 1727204594.04699: done getting the remaining hosts for this loop 42613 1727204594.04704: getting the next task for host managed-node3 42613 1727204594.04713: done getting next task for host managed-node3 42613 1727204594.04717: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 42613 1727204594.04721: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204594.04740: getting variables 42613 1727204594.04742: in VariableManager get_vars() 42613 1727204594.05086: Calling all_inventory to load vars for managed-node3 42613 1727204594.05089: Calling groups_inventory to load vars for managed-node3 42613 1727204594.05092: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204594.05103: Calling all_plugins_play to load vars for managed-node3 42613 1727204594.05106: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204594.05109: Calling groups_plugins_play to load vars for managed-node3 42613 1727204594.05786: done sending task result for task 127b8e07-fff9-2f91-05d8-00000000002b 42613 1727204594.05790: WORKER PROCESS EXITING 42613 1727204594.06937: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204594.09024: done with get_vars() 42613 1727204594.09052: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 15:03:14 -0400 (0:00:00.064) 0:00:22.699 ***** 42613 1727204594.09144: entering _queue_task() for managed-node3/ping 42613 1727204594.09146: Creating lock for ping 42613 1727204594.09441: worker is 1 (out of 1 available) 42613 1727204594.09456: exiting _queue_task() for managed-node3/ping 42613 1727204594.09472: done queuing things up, now waiting for results queue to drain 42613 1727204594.09474: waiting for pending results... 
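The records that follow trace Ansible's standard low-level module execution sequence for this ping task: discover the remote home (`echo ~`), create a private temp dir under `~/.ansible/tmp`, sftp-put the AnsiballZ payload, `chmod u+x` it, run it with the remote Python, and remove the dir. A minimal local sketch of the temp-dir step, using the same `( umask 77 && mkdir ... )` idiom the log shows (the `ansible-demo` path and `$$` suffix are stand-ins for the real timestamped name):

```shell
# Illustrative only: reproduces the private-tmpdir pattern from the log,
# not the actual remote paths of this run.
base="${TMPDIR:-/tmp}/ansible-demo"
# umask 77 inside the subshell means the new directory comes out mode 0700
# (owner-only), mirroring '( umask 77 && mkdir -p "`echo ~/.ansible/tmp`" ... )':
tmpdir=$( umask 77 && mkdir -p "$base" \
          && mkdir "$base/ansible-tmp-demo-$$" \
          && echo "$base/ansible-tmp-demo-$$" )
perms=$(stat -c '%a' "$tmpdir")
echo "created $tmpdir with mode $perms"
# The log's later steps would put AnsiballZ_ping.py here, chmod it, execute
# it with python3.12, then clean up exactly like this:
rm -rf "$base"
echo "cleaned up: $( [ -d "$tmpdir" ] && echo no || echo yes )"
```

The restrictive umask is the design point: the payload directory is unreadable by other users for the whole transfer-and-execute window, which matters because the AnsiballZ zip can embed task variables.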
42613 1727204594.09676: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 42613 1727204594.09784: in run() - task 127b8e07-fff9-2f91-05d8-00000000002c 42613 1727204594.09795: variable 'ansible_search_path' from source: unknown 42613 1727204594.09799: variable 'ansible_search_path' from source: unknown 42613 1727204594.09834: calling self._execute() 42613 1727204594.09914: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204594.09919: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204594.09933: variable 'omit' from source: magic vars 42613 1727204594.10233: variable 'ansible_distribution_major_version' from source: facts 42613 1727204594.10246: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204594.10259: variable 'omit' from source: magic vars 42613 1727204594.10304: variable 'omit' from source: magic vars 42613 1727204594.10332: variable 'omit' from source: magic vars 42613 1727204594.10374: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 42613 1727204594.10405: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 42613 1727204594.10423: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 42613 1727204594.10438: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204594.10450: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204594.10481: variable 'inventory_hostname' from source: host vars for 'managed-node3' 42613 1727204594.10484: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204594.10487: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node3' 42613 1727204594.10570: Set connection var ansible_shell_executable to /bin/sh 42613 1727204594.10574: Set connection var ansible_pipelining to False 42613 1727204594.10587: Set connection var ansible_module_compression to ZIP_DEFLATED 42613 1727204594.10590: Set connection var ansible_connection to ssh 42613 1727204594.10592: Set connection var ansible_timeout to 10 42613 1727204594.10595: Set connection var ansible_shell_type to sh 42613 1727204594.10613: variable 'ansible_shell_executable' from source: unknown 42613 1727204594.10623: variable 'ansible_connection' from source: unknown 42613 1727204594.10627: variable 'ansible_module_compression' from source: unknown 42613 1727204594.10629: variable 'ansible_shell_type' from source: unknown 42613 1727204594.10632: variable 'ansible_shell_executable' from source: unknown 42613 1727204594.10634: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204594.10636: variable 'ansible_pipelining' from source: unknown 42613 1727204594.10638: variable 'ansible_timeout' from source: unknown 42613 1727204594.10655: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204594.10883: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 42613 1727204594.11045: variable 'omit' from source: magic vars 42613 1727204594.11049: starting attempt loop 42613 1727204594.11052: running the handler 42613 1727204594.11054: _low_level_execute_command(): starting 42613 1727204594.11056: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 42613 1727204594.11786: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204594.11791: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204594.11930: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204594.13757: stdout chunk (state=3): >>>/root <<< 42613 1727204594.13862: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204594.13933: stderr chunk (state=3): >>><<< 42613 1727204594.13936: stdout chunk (state=3): >>><<< 42613 1727204594.13962: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204594.14006: _low_level_execute_command(): starting 42613 1727204594.14011: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204594.1396303-44467-206488456135318 `" && echo ansible-tmp-1727204594.1396303-44467-206488456135318="` echo /root/.ansible/tmp/ansible-tmp-1727204594.1396303-44467-206488456135318 `" ) && sleep 0' 42613 1727204594.14647: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204594.14653: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204594.14675: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204594.14708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 42613 1727204594.14711: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204594.14714: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 42613 1727204594.14724: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204594.14779: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204594.14783: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204594.14824: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204594.15012: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204594.17160: stdout chunk (state=3): >>>ansible-tmp-1727204594.1396303-44467-206488456135318=/root/.ansible/tmp/ansible-tmp-1727204594.1396303-44467-206488456135318 <<< 42613 1727204594.17292: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204594.17377: stderr chunk (state=3): >>><<< 42613 1727204594.17381: stdout chunk (state=3): >>><<< 42613 1727204594.17576: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204594.1396303-44467-206488456135318=/root/.ansible/tmp/ansible-tmp-1727204594.1396303-44467-206488456135318 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204594.17581: variable 'ansible_module_compression' from source: unknown 42613 1727204594.17583: ANSIBALLZ: Using lock for ping 42613 1727204594.17585: ANSIBALLZ: Acquiring lock 42613 1727204594.17588: ANSIBALLZ: Lock acquired: 139982759698048 42613 1727204594.17590: ANSIBALLZ: Creating module 42613 1727204594.32950: ANSIBALLZ: Writing module into payload 42613 1727204594.33000: ANSIBALLZ: Writing module 42613 1727204594.33021: ANSIBALLZ: Renaming module 42613 1727204594.33027: ANSIBALLZ: Done creating module 42613 1727204594.33042: variable 'ansible_facts' from source: unknown 42613 1727204594.33088: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204594.1396303-44467-206488456135318/AnsiballZ_ping.py 42613 1727204594.33204: Sending initial data 42613 1727204594.33208: Sent initial data (153 bytes) 42613 1727204594.33739: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204594.33746: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204594.33749: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204594.33807: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204594.33811: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204594.33900: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204594.35696: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 42613 1727204594.35756: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 42613 1727204594.35830: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-42613igbktgy6/tmpf3nxkrl4 /root/.ansible/tmp/ansible-tmp-1727204594.1396303-44467-206488456135318/AnsiballZ_ping.py <<< 42613 1727204594.35837: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204594.1396303-44467-206488456135318/AnsiballZ_ping.py" <<< 42613 1727204594.35895: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-42613igbktgy6/tmpf3nxkrl4" to remote "/root/.ansible/tmp/ansible-tmp-1727204594.1396303-44467-206488456135318/AnsiballZ_ping.py" <<< 42613 1727204594.35898: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204594.1396303-44467-206488456135318/AnsiballZ_ping.py" <<< 42613 1727204594.36552: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204594.36632: stderr chunk (state=3): >>><<< 42613 1727204594.36636: stdout chunk (state=3): >>><<< 42613 1727204594.36656: done transferring module to remote 42613 1727204594.36669: _low_level_execute_command(): starting 42613 1727204594.36679: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204594.1396303-44467-206488456135318/ /root/.ansible/tmp/ansible-tmp-1727204594.1396303-44467-206488456135318/AnsiballZ_ping.py && sleep 0' 42613 1727204594.37162: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204594.37187: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found <<< 42613 1727204594.37195: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204594.37252: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204594.37255: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204594.37258: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204594.37335: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204594.39373: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204594.39407: stderr chunk (state=3): >>><<< 42613 1727204594.39411: stdout chunk (state=3): >>><<< 42613 1727204594.39427: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204594.39430: _low_level_execute_command(): starting 42613 1727204594.39435: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204594.1396303-44467-206488456135318/AnsiballZ_ping.py && sleep 0' 42613 1727204594.39961: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204594.39967: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204594.39970: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 42613 1727204594.39972: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204594.40026: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204594.40030: stderr 
chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204594.40032: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204594.40117: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204594.57598: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 42613 1727204594.58922: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204594.58976: stderr chunk (state=3): >>>Shared connection to 10.31.45.169 closed. <<< 42613 1727204594.59304: stderr chunk (state=3): >>><<< 42613 1727204594.59308: stdout chunk (state=3): >>><<< 42613 1727204594.59311: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 Shared connection to 10.31.45.169 closed. 42613 1727204594.59315: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204594.1396303-44467-206488456135318/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 42613 1727204594.59317: _low_level_execute_command(): starting 42613 1727204594.59319: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204594.1396303-44467-206488456135318/ > /dev/null 2>&1 && sleep 0' 42613 1727204594.60595: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204594.60599: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 42613 1727204594.60715: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204594.60729: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204594.60799: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204594.60823: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204594.60944: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204594.63273: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204594.63277: stdout chunk (state=3): >>><<< 42613 1727204594.63279: stderr chunk (state=3): >>><<< 42613 1727204594.63281: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 42613 1727204594.63283: handler run complete 42613 1727204594.63285: attempt loop complete, returning result 42613 1727204594.63287: _execute() done 42613 1727204594.63289: dumping result to json 42613 1727204594.63290: done dumping result, returning 42613 1727204594.63292: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [127b8e07-fff9-2f91-05d8-00000000002c] 42613 1727204594.63294: sending task result for task 127b8e07-fff9-2f91-05d8-00000000002c 42613 1727204594.63363: done sending task result for task 127b8e07-fff9-2f91-05d8-00000000002c 42613 1727204594.63368: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "ping": "pong" } 42613 1727204594.63425: no more pending results, returning what we have 42613 1727204594.63430: results queue empty 42613 1727204594.63431: checking for any_errors_fatal 42613 1727204594.63441: done checking for any_errors_fatal 42613 1727204594.63442: checking for max_fail_percentage 42613 1727204594.63444: done checking for max_fail_percentage 42613 1727204594.63444: checking to see if all hosts have failed and the running result is not ok 42613 1727204594.63445: done checking to see if all hosts have failed 42613 1727204594.63446: getting the remaining hosts for this loop 42613 1727204594.63448: done getting the remaining hosts for this loop 42613 1727204594.63452: getting the next task for host managed-node3 42613 1727204594.63462: done getting next task for host managed-node3 42613 1727204594.63469: ^ task is: TASK: meta (role_complete) 42613 1727204594.63472: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 42613 1727204594.63484: getting variables 42613 1727204594.63485: in VariableManager get_vars() 42613 1727204594.63522: Calling all_inventory to load vars for managed-node3 42613 1727204594.63525: Calling groups_inventory to load vars for managed-node3 42613 1727204594.63527: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204594.63536: Calling all_plugins_play to load vars for managed-node3 42613 1727204594.63540: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204594.63543: Calling groups_plugins_play to load vars for managed-node3 42613 1727204594.65412: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204594.67650: done with get_vars() 42613 1727204594.67681: done getting variables 42613 1727204594.67765: done queuing things up, now waiting for results queue to drain 42613 1727204594.67769: results queue empty 42613 1727204594.67770: checking for any_errors_fatal 42613 1727204594.67773: done checking for any_errors_fatal 42613 1727204594.67773: checking for max_fail_percentage 42613 1727204594.67775: done checking for max_fail_percentage 42613 1727204594.67775: checking to see if all hosts have failed and the running result is not ok 42613 1727204594.67776: done checking to see if all hosts have failed 42613 1727204594.67777: getting the remaining hosts for this loop 42613 1727204594.67778: done getting the remaining hosts for this loop 42613 1727204594.67780: getting the next task for host managed-node3 42613 1727204594.67784: done getting next task for host managed-node3 42613 1727204594.67786: ^ task is: TASK: Get the routing rule for looking up the table 30200 42613 1727204594.67788: ^ state is: HOST STATE: block=2, task=11, rescue=0, 
always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 42613 1727204594.67790: getting variables 42613 1727204594.67791: in VariableManager get_vars() 42613 1727204594.67804: Calling all_inventory to load vars for managed-node3 42613 1727204594.67806: Calling groups_inventory to load vars for managed-node3 42613 1727204594.67808: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204594.67813: Calling all_plugins_play to load vars for managed-node3 42613 1727204594.67815: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204594.67817: Calling groups_plugins_play to load vars for managed-node3 42613 1727204594.75086: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204594.77239: done with get_vars() 42613 1727204594.77278: done getting variables 42613 1727204594.77330: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get the routing rule for looking up the table 30200] ********************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:115 Tuesday 24 September 2024 15:03:14 -0400 (0:00:00.682) 0:00:23.381 ***** 42613 1727204594.77361: entering _queue_task() for managed-node3/command 42613 1727204594.77827: worker is 1 (out of 1 available) 42613 1727204594.77846: exiting _queue_task() for managed-node3/command 42613 1727204594.77861: done queuing things up, now waiting for results queue to drain 42613 1727204594.77863: waiting for 
pending results... 42613 1727204594.78206: running TaskExecutor() for managed-node3/TASK: Get the routing rule for looking up the table 30200 42613 1727204594.78349: in run() - task 127b8e07-fff9-2f91-05d8-00000000005c 42613 1727204594.78377: variable 'ansible_search_path' from source: unknown 42613 1727204594.78430: calling self._execute() 42613 1727204594.78572: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204594.78588: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204594.78604: variable 'omit' from source: magic vars 42613 1727204594.79077: variable 'ansible_distribution_major_version' from source: facts 42613 1727204594.79103: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204594.79228: variable 'ansible_distribution_major_version' from source: facts 42613 1727204594.79232: Evaluated conditional (ansible_distribution_major_version != "7"): True 42613 1727204594.79241: variable 'omit' from source: magic vars 42613 1727204594.79264: variable 'omit' from source: magic vars 42613 1727204594.79295: variable 'omit' from source: magic vars 42613 1727204594.79336: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 42613 1727204594.79368: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 42613 1727204594.79390: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 42613 1727204594.79406: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204594.79419: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204594.79450: variable 'inventory_hostname' from source: host vars for 'managed-node3' 42613 1727204594.79454: variable 'ansible_host' from 
source: host vars for 'managed-node3' 42613 1727204594.79456: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204594.79540: Set connection var ansible_shell_executable to /bin/sh 42613 1727204594.79546: Set connection var ansible_pipelining to False 42613 1727204594.79553: Set connection var ansible_module_compression to ZIP_DEFLATED 42613 1727204594.79555: Set connection var ansible_connection to ssh 42613 1727204594.79561: Set connection var ansible_timeout to 10 42613 1727204594.79564: Set connection var ansible_shell_type to sh 42613 1727204594.79586: variable 'ansible_shell_executable' from source: unknown 42613 1727204594.79589: variable 'ansible_connection' from source: unknown 42613 1727204594.79592: variable 'ansible_module_compression' from source: unknown 42613 1727204594.79595: variable 'ansible_shell_type' from source: unknown 42613 1727204594.79597: variable 'ansible_shell_executable' from source: unknown 42613 1727204594.79600: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204594.79603: variable 'ansible_pipelining' from source: unknown 42613 1727204594.79605: variable 'ansible_timeout' from source: unknown 42613 1727204594.79610: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204594.79734: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 42613 1727204594.79746: variable 'omit' from source: magic vars 42613 1727204594.79751: starting attempt loop 42613 1727204594.79754: running the handler 42613 1727204594.79770: _low_level_execute_command(): starting 42613 1727204594.79798: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 42613 1727204594.80360: stderr chunk 
(state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204594.80368: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204594.80372: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204594.80411: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204594.80433: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204594.80513: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204594.82346: stdout chunk (state=3): >>>/root <<< 42613 1727204594.82454: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204594.82516: stderr chunk (state=3): >>><<< 42613 1727204594.82520: stdout chunk (state=3): >>><<< 42613 1727204594.82550: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204594.82560: _low_level_execute_command(): starting 42613 1727204594.82570: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204594.8254828-44496-274561517793473 `" && echo ansible-tmp-1727204594.8254828-44496-274561517793473="` echo /root/.ansible/tmp/ansible-tmp-1727204594.8254828-44496-274561517793473 `" ) && sleep 0' 42613 1727204594.83061: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204594.83066: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204594.83085: stderr chunk (state=3): >>>debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204594.83090: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204594.83128: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204594.83131: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204594.83139: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204594.83213: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204594.85387: stdout chunk (state=3): >>>ansible-tmp-1727204594.8254828-44496-274561517793473=/root/.ansible/tmp/ansible-tmp-1727204594.8254828-44496-274561517793473 <<< 42613 1727204594.85608: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204594.85612: stdout chunk (state=3): >>><<< 42613 1727204594.85616: stderr chunk (state=3): >>><<< 42613 1727204594.85639: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204594.8254828-44496-274561517793473=/root/.ansible/tmp/ansible-tmp-1727204594.8254828-44496-274561517793473 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204594.85715: variable 'ansible_module_compression' from source: unknown 42613 1727204594.85760: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-42613igbktgy6/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 42613 1727204594.85821: variable 'ansible_facts' from source: unknown 42613 1727204594.85936: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204594.8254828-44496-274561517793473/AnsiballZ_command.py 42613 1727204594.86184: Sending initial data 42613 1727204594.86187: Sent initial data (156 bytes) 42613 1727204594.86781: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204594.86826: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204594.86933: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204594.86957: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204594.87064: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204594.88836: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 42613 1727204594.88900: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 42613 1727204594.88974: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-42613igbktgy6/tmpmkfrdt1d /root/.ansible/tmp/ansible-tmp-1727204594.8254828-44496-274561517793473/AnsiballZ_command.py <<< 42613 1727204594.88978: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204594.8254828-44496-274561517793473/AnsiballZ_command.py" <<< 42613 1727204594.89054: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-42613igbktgy6/tmpmkfrdt1d" to remote "/root/.ansible/tmp/ansible-tmp-1727204594.8254828-44496-274561517793473/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204594.8254828-44496-274561517793473/AnsiballZ_command.py" <<< 42613 1727204594.89947: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204594.90034: stderr chunk (state=3): >>><<< 42613 1727204594.90055: stdout chunk (state=3): >>><<< 42613 1727204594.90089: done transferring module to remote 42613 1727204594.90138: _low_level_execute_command(): starting 42613 1727204594.90142: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204594.8254828-44496-274561517793473/ /root/.ansible/tmp/ansible-tmp-1727204594.8254828-44496-274561517793473/AnsiballZ_command.py && sleep 0' 42613 1727204594.90856: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204594.90973: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found <<< 42613 1727204594.90990: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204594.91035: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204594.91056: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204594.91083: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204594.91192: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204594.93381: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204594.93386: stdout chunk (state=3): >>><<< 42613 1727204594.93388: stderr chunk (state=3): >>><<< 42613 1727204594.93390: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204594.93393: _low_level_execute_command(): starting 42613 1727204594.93395: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204594.8254828-44496-274561517793473/AnsiballZ_command.py && sleep 0' 42613 1727204594.94056: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204594.94079: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204594.94151: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204594.94217: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204594.94238: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204594.94276: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204594.94399: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204595.12691: stdout chunk (state=3): >>> {"changed": true, "stdout": "30200:\tfrom 198.51.100.58/26 lookup 30200 proto static\n30201:\tfrom all fwmark 0x1/0x1 lookup 30200 proto static\n30202:\tfrom all ipproto tcp lookup 30200 proto static\n30203:\tfrom all sport 128-256 lookup 30200 proto static\n30204:\tfrom all tos 0x08 lookup 30200 proto static", "stderr": "", "rc": 0, "cmd": ["ip", "rule", "list", "table", "30200"], "start": "2024-09-24 15:03:15.119567", "end": "2024-09-24 15:03:15.125560", "delta": "0:00:00.005993", "msg": "", "invocation": {"module_args": {"_raw_params": "ip rule list table 30200", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 42613 1727204595.14468: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
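The module result above carries the `ip rule list table 30200` output as a single newline-joined string in its `stdout` field. A minimal sketch of turning that captured text into structured records (the rule lines are taken verbatim from the log; the `parse_rules` helper and its field names are illustrative, not part of Ansible or iproute2):

```python
# Rule listing exactly as returned in the module result's "stdout" field.
RAW = """\
30200:\tfrom 198.51.100.58/26 lookup 30200 proto static
30201:\tfrom all fwmark 0x1/0x1 lookup 30200 proto static
30202:\tfrom all ipproto tcp lookup 30200 proto static
30203:\tfrom all sport 128-256 lookup 30200 proto static
30204:\tfrom all tos 0x08 lookup 30200 proto static"""

def parse_rules(text):
    """Split each 'ip rule' line into its numeric priority (before the
    colon) and the remaining selector/action text."""
    rules = []
    for line in text.splitlines():
        prio, _, rest = line.partition(":")
        rules.append({"priority": int(prio), "rule": rest.strip()})
    return rules

rules = parse_rules(RAW)
```

A follow-up assertion-style check in a test playbook could then compare `rules` against the expected table-30200 entries instead of matching raw strings.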
<<< 42613 1727204595.14529: stderr chunk (state=3): >>><<< 42613 1727204595.14532: stdout chunk (state=3): >>><<< 42613 1727204595.14552: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "30200:\tfrom 198.51.100.58/26 lookup 30200 proto static\n30201:\tfrom all fwmark 0x1/0x1 lookup 30200 proto static\n30202:\tfrom all ipproto tcp lookup 30200 proto static\n30203:\tfrom all sport 128-256 lookup 30200 proto static\n30204:\tfrom all tos 0x08 lookup 30200 proto static", "stderr": "", "rc": 0, "cmd": ["ip", "rule", "list", "table", "30200"], "start": "2024-09-24 15:03:15.119567", "end": "2024-09-24 15:03:15.125560", "delta": "0:00:00.005993", "msg": "", "invocation": {"module_args": {"_raw_params": "ip rule list table 30200", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 42613 1727204595.14584: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip rule list table 30200', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204594.8254828-44496-274561517793473/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 42613 1727204595.14592: _low_level_execute_command(): starting 42613 1727204595.14598: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204594.8254828-44496-274561517793473/ > /dev/null 2>&1 && sleep 0' 42613 1727204595.15098: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204595.15106: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204595.15109: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204595.15111: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204595.15159: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204595.15163: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204595.15239: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204595.22125: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204595.22164: stderr chunk (state=3): >>><<< 42613 1727204595.22168: stdout chunk (state=3): >>><<< 42613 1727204595.22186: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204595.22192: handler run complete 42613 1727204595.22215: Evaluated conditional (False): False 42613 1727204595.22225: attempt loop complete, returning result 42613 1727204595.22227: _execute() done 42613 1727204595.22230: dumping result to json 42613 1727204595.22236: done dumping result, returning 42613 1727204595.22245: done running TaskExecutor() for managed-node3/TASK: Get the routing rule for looking up the table 30200 [127b8e07-fff9-2f91-05d8-00000000005c] 42613 1727204595.22250: sending task result for task 127b8e07-fff9-2f91-05d8-00000000005c 42613 1727204595.22363: done sending task result for task 127b8e07-fff9-2f91-05d8-00000000005c 42613 1727204595.22367: WORKER PROCESS EXITING
ok: [managed-node3] => {
    "changed": false,
    "cmd": [
        "ip",
        "rule",
        "list",
        "table",
        "30200"
    ],
    "delta": "0:00:00.005993",
    "end": "2024-09-24 15:03:15.125560",
    "rc": 0,
    "start": "2024-09-24 15:03:15.119567"
}

STDOUT:

30200: from 198.51.100.58/26 lookup 30200 proto static
30201: from all fwmark 0x1/0x1 lookup 30200 proto static
30202: from all ipproto tcp lookup 30200 proto static
30203: from all sport 128-256 lookup 30200 proto static
30204: from all tos 0x08 lookup 30200 proto static

42613 1727204595.22468: no more pending results, returning what we have 42613 1727204595.22472: results queue empty 42613 1727204595.22473: checking for any_errors_fatal 42613 1727204595.22475: done checking for any_errors_fatal 42613 1727204595.22477: checking for max_fail_percentage 42613 1727204595.22479: done checking for max_fail_percentage 42613 1727204595.22480: checking to see if all hosts have failed and the running result is not ok 42613 1727204595.22481: done checking to see if all hosts have failed 42613 1727204595.22482: getting the remaining 
hosts for this loop 42613 1727204595.22483: done getting the remaining hosts for this loop 42613 1727204595.22487: getting the next task for host managed-node3 42613 1727204595.22494: done getting next task for host managed-node3 42613 1727204595.22496: ^ task is: TASK: Get the routing rule for looking up the table 30400 42613 1727204595.22498: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 42613 1727204595.22502: getting variables 42613 1727204595.22504: in VariableManager get_vars() 42613 1727204595.22540: Calling all_inventory to load vars for managed-node3 42613 1727204595.22543: Calling groups_inventory to load vars for managed-node3 42613 1727204595.22545: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204595.22556: Calling all_plugins_play to load vars for managed-node3 42613 1727204595.22559: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204595.22561: Calling groups_plugins_play to load vars for managed-node3 42613 1727204595.23680: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204595.24923: done with get_vars() 42613 1727204595.24952: done getting variables 42613 1727204595.25006: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get the routing rule for looking up the table 30400] ********************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:122 
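The `ok:` result above captures the stdout of `ip rule list table 30200`, one routing rule per line with a leading priority (`30200: from 198.51.100.58/26 lookup 30200 proto static`, fwmark/ipproto/sport/tos selectors on the following priorities). A check that consumes such output typically splits each line into priority and selector body; a minimal sketch, where the helper name is hypothetical and not part of the playbook or collection:

```python
import re

def parse_ip_rule_line(line):
    """Split one line of `ip rule list` output into priority and rule body.

    Illustrative helper only -- not an Ansible or linux_system_roles API.
    Returns None for lines that do not look like `<prio>: <selector...>`.
    """
    m = re.match(r"^(\d+):\s+(.*)$", line.strip())
    if not m:
        return None
    return {"priority": int(m.group(1)), "rule": m.group(2)}

# Sample taken from the task result logged above.
sample = "30200: from 198.51.100.58/26 lookup 30200 proto static"
parsed = parse_ip_rule_line(sample)
```

A test playbook can then assert on `parsed["priority"]` and substring-match the selector instead of comparing whole lines, which keeps the check stable across kernel formatting differences.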
Tuesday 24 September 2024 15:03:15 -0400 (0:00:00.476) 0:00:23.858 ***** 42613 1727204595.25028: entering _queue_task() for managed-node3/command 42613 1727204595.25322: worker is 1 (out of 1 available) 42613 1727204595.25336: exiting _queue_task() for managed-node3/command 42613 1727204595.25352: done queuing things up, now waiting for results queue to drain 42613 1727204595.25353: waiting for pending results... 42613 1727204595.25555: running TaskExecutor() for managed-node3/TASK: Get the routing rule for looking up the table 30400 42613 1727204595.25637: in run() - task 127b8e07-fff9-2f91-05d8-00000000005d 42613 1727204595.25650: variable 'ansible_search_path' from source: unknown 42613 1727204595.25687: calling self._execute() 42613 1727204595.25782: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204595.25788: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204595.25799: variable 'omit' from source: magic vars 42613 1727204595.26123: variable 'ansible_distribution_major_version' from source: facts 42613 1727204595.26135: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204595.26229: variable 'ansible_distribution_major_version' from source: facts 42613 1727204595.26233: Evaluated conditional (ansible_distribution_major_version != "7"): True 42613 1727204595.26244: variable 'omit' from source: magic vars 42613 1727204595.26264: variable 'omit' from source: magic vars 42613 1727204595.26297: variable 'omit' from source: magic vars 42613 1727204595.26334: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 42613 1727204595.26370: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 42613 1727204595.26388: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 42613 1727204595.26403: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204595.26413: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204595.26439: variable 'inventory_hostname' from source: host vars for 'managed-node3' 42613 1727204595.26445: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204595.26449: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204595.26529: Set connection var ansible_shell_executable to /bin/sh 42613 1727204595.26533: Set connection var ansible_pipelining to False 42613 1727204595.26543: Set connection var ansible_module_compression to ZIP_DEFLATED 42613 1727204595.26546: Set connection var ansible_connection to ssh 42613 1727204595.26552: Set connection var ansible_timeout to 10 42613 1727204595.26554: Set connection var ansible_shell_type to sh 42613 1727204595.26577: variable 'ansible_shell_executable' from source: unknown 42613 1727204595.26581: variable 'ansible_connection' from source: unknown 42613 1727204595.26586: variable 'ansible_module_compression' from source: unknown 42613 1727204595.26588: variable 'ansible_shell_type' from source: unknown 42613 1727204595.26590: variable 'ansible_shell_executable' from source: unknown 42613 1727204595.26593: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204595.26595: variable 'ansible_pipelining' from source: unknown 42613 1727204595.26597: variable 'ansible_timeout' from source: unknown 42613 1727204595.26599: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204595.26718: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 42613 1727204595.26728: variable 'omit' from source: magic vars 42613 1727204595.26733: starting attempt loop 42613 1727204595.26736: running the handler 42613 1727204595.26752: _low_level_execute_command(): starting 42613 1727204595.26759: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 42613 1727204595.27326: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204595.27332: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204595.27336: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204595.27394: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204595.27398: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204595.27481: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204595.29295: stdout chunk (state=3): 
>>>/root <<< 42613 1727204595.29423: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204595.29462: stderr chunk (state=3): >>><<< 42613 1727204595.29468: stdout chunk (state=3): >>><<< 42613 1727204595.29493: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204595.29505: _low_level_execute_command(): starting 42613 1727204595.29514: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204595.294924-44510-213983408837329 `" && echo ansible-tmp-1727204595.294924-44510-213983408837329="` echo /root/.ansible/tmp/ansible-tmp-1727204595.294924-44510-213983408837329 `" ) && sleep 0' 42613 1727204595.30021: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 
Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204595.30026: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204595.30036: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204595.30039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204595.30095: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204595.30104: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204595.30108: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204595.30175: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204595.32310: stdout chunk (state=3): >>>ansible-tmp-1727204595.294924-44510-213983408837329=/root/.ansible/tmp/ansible-tmp-1727204595.294924-44510-213983408837329 <<< 42613 1727204595.32420: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204595.32486: stderr chunk (state=3): >>><<< 42613 1727204595.32490: stdout chunk (state=3): >>><<< 42613 1727204595.32507: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204595.294924-44510-213983408837329=/root/.ansible/tmp/ansible-tmp-1727204595.294924-44510-213983408837329 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204595.32540: variable 'ansible_module_compression' from source: unknown 42613 1727204595.32591: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-42613igbktgy6/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 42613 1727204595.32627: variable 'ansible_facts' from source: unknown 42613 1727204595.32684: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204595.294924-44510-213983408837329/AnsiballZ_command.py 42613 1727204595.32801: Sending initial data 42613 1727204595.32804: Sent initial data (155 bytes) 42613 1727204595.33311: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204595.33316: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204595.33318: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204595.33382: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204595.33386: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204595.33474: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204595.35239: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" 
revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 42613 1727204595.35306: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 42613 1727204595.35376: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-42613igbktgy6/tmpynyu10us /root/.ansible/tmp/ansible-tmp-1727204595.294924-44510-213983408837329/AnsiballZ_command.py <<< 42613 1727204595.35379: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204595.294924-44510-213983408837329/AnsiballZ_command.py" <<< 42613 1727204595.35438: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-42613igbktgy6/tmpynyu10us" to remote "/root/.ansible/tmp/ansible-tmp-1727204595.294924-44510-213983408837329/AnsiballZ_command.py" <<< 42613 1727204595.35442: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204595.294924-44510-213983408837329/AnsiballZ_command.py" <<< 42613 1727204595.36101: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204595.36180: stderr chunk (state=3): >>><<< 42613 1727204595.36184: stdout chunk (state=3): >>><<< 42613 1727204595.36204: done transferring module to remote 42613 1727204595.36216: _low_level_execute_command(): starting 42613 1727204595.36221: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204595.294924-44510-213983408837329/ /root/.ansible/tmp/ansible-tmp-1727204595.294924-44510-213983408837329/AnsiballZ_command.py && sleep 0' 42613 1727204595.36727: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204595.36731: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204595.36733: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204595.36736: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204595.36802: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204595.36805: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204595.36807: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204595.36872: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204595.38861: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204595.38925: stderr chunk (state=3): >>><<< 42613 1727204595.38928: stdout chunk (state=3): >>><<< 42613 1727204595.38947: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204595.38951: _low_level_execute_command(): starting 42613 1727204595.38954: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204595.294924-44510-213983408837329/AnsiballZ_command.py && sleep 0' 42613 1727204595.39467: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204595.39472: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 42613 1727204595.39475: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found 
<<< 42613 1727204595.39478: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204595.39530: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204595.39533: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204595.39539: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204595.39622: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204595.57639: stdout chunk (state=3): >>> {"changed": true, "stdout": "30400:\tfrom all to 198.51.100.128/26 lookup 30400 proto static\n30401:\tfrom all iif iiftest [detached] lookup 30400 proto static\n30402:\tfrom all oif oiftest [detached] lookup 30400 proto static\n30403:\tfrom all lookup 30400 proto static", "stderr": "", "rc": 0, "cmd": ["ip", "rule", "list", "table", "30400"], "start": "2024-09-24 15:03:15.570858", "end": "2024-09-24 15:03:15.575100", "delta": "0:00:00.004242", "msg": "", "invocation": {"module_args": {"_raw_params": "ip rule list table 30400", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 42613 1727204595.59460: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
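The stdout chunk above is the entire module result: `AnsiballZ_command.py` prints a single JSON object (`changed`, `stdout`, `rc`, `cmd`, `invocation`, …) which the controller parses back into the task result shown later in the log. A minimal sketch of reading such a blob; the sample is abridged from the logged result, not the full payload:

```python
import json

# Abridged copy of the module result seen in the stdout chunk above.
raw = ('{"changed": true, "stdout": "30400:\\tfrom all to 198.51.100.128/26 '
       'lookup 30400 proto static\\n30403:\\tfrom all lookup 30400 proto static", '
       '"rc": 0, "cmd": ["ip", "rule", "list", "table", "30400"]}')

result = json.loads(raw)
rules = result["stdout"].splitlines()  # one routing rule per line
ok = result["rc"] == 0 and len(rules) == 2
```

This is also why a leaked warning or stray print on the remote side breaks the run: anything that is not part of that one JSON document makes the controller's parse fail.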
<<< 42613 1727204595.59519: stderr chunk (state=3): >>><<< 42613 1727204595.59522: stdout chunk (state=3): >>><<< 42613 1727204595.59542: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "30400:\tfrom all to 198.51.100.128/26 lookup 30400 proto static\n30401:\tfrom all iif iiftest [detached] lookup 30400 proto static\n30402:\tfrom all oif oiftest [detached] lookup 30400 proto static\n30403:\tfrom all lookup 30400 proto static", "stderr": "", "rc": 0, "cmd": ["ip", "rule", "list", "table", "30400"], "start": "2024-09-24 15:03:15.570858", "end": "2024-09-24 15:03:15.575100", "delta": "0:00:00.004242", "msg": "", "invocation": {"module_args": {"_raw_params": "ip rule list table 30400", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 42613 1727204595.59577: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip rule list table 30400', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204595.294924-44510-213983408837329/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 42613 1727204595.59585: _low_level_execute_command(): starting 42613 1727204595.59590: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204595.294924-44510-213983408837329/ > /dev/null 2>&1 && sleep 0' 42613 1727204595.60101: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204595.60105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 42613 1727204595.60114: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found <<< 42613 1727204595.60117: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204595.60181: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204595.60184: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204595.60185: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204595.60248: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204595.62479: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204595.62484: stdout chunk (state=3): >>><<< 42613 1727204595.62487: stderr chunk (state=3): >>><<< 42613 1727204595.62490: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204595.62492: handler run complete 42613 1727204595.62495: Evaluated conditional (False): False 42613 1727204595.62516: attempt loop complete, returning result 42613 1727204595.62526: _execute() done 42613 1727204595.62534: dumping result to json 42613 1727204595.62550: done dumping result, returning 42613 1727204595.62569: done running TaskExecutor() for managed-node3/TASK: Get the routing rule for looking up the table 30400 [127b8e07-fff9-2f91-05d8-00000000005d] 42613 1727204595.62590: sending task result for task 127b8e07-fff9-2f91-05d8-00000000005d ok: [managed-node3] => { "changed": false, "cmd": [ "ip", "rule", "list", "table", "30400" ], "delta": "0:00:00.004242", "end": "2024-09-24 15:03:15.575100", "rc": 0, "start": "2024-09-24 15:03:15.570858" } STDOUT: 30400: from all to 198.51.100.128/26 lookup 30400 proto static 30401: from all iif iiftest [detached] lookup 30400 proto static 30402: from all oif oiftest [detached] lookup 30400 proto static 30403: from all lookup 30400 proto static 42613 1727204595.62841: no more pending results, returning what we have 42613 1727204595.62845: results queue empty 42613 1727204595.62846: checking for any_errors_fatal 42613 1727204595.62858: done checking for any_errors_fatal 42613 1727204595.62859: checking for max_fail_percentage 42613 1727204595.62862: done checking for max_fail_percentage 42613 1727204595.62863: checking to see if all hosts have failed and the running result is not ok 42613 1727204595.62864: done checking to see if all hosts have failed 42613 1727204595.62864: getting the remaining hosts for this loop 42613 1727204595.63082: done getting the remaining hosts for this loop 42613 1727204595.63088: getting the next task for host managed-node3 42613 
1727204595.63094: done getting next task for host managed-node3 42613 1727204595.63098: ^ task is: TASK: Get the routing rule for looking up the table 30600 42613 1727204595.63100: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 42613 1727204595.63104: getting variables 42613 1727204595.63106: in VariableManager get_vars() 42613 1727204595.63150: Calling all_inventory to load vars for managed-node3 42613 1727204595.63154: Calling groups_inventory to load vars for managed-node3 42613 1727204595.63156: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204595.63179: Calling all_plugins_play to load vars for managed-node3 42613 1727204595.63184: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204595.63288: Calling groups_plugins_play to load vars for managed-node3 42613 1727204595.63887: done sending task result for task 127b8e07-fff9-2f91-05d8-00000000005d 42613 1727204595.63892: WORKER PROCESS EXITING 42613 1727204595.64994: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204595.66212: done with get_vars() 42613 1727204595.66241: done getting variables 42613 1727204595.66293: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get the routing rule for looking up the table 30600] ********************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:129 Tuesday 24 September 2024 
15:03:15 -0400 (0:00:00.412) 0:00:24.271 ***** 42613 1727204595.66316: entering _queue_task() for managed-node3/command 42613 1727204595.66618: worker is 1 (out of 1 available) 42613 1727204595.66633: exiting _queue_task() for managed-node3/command 42613 1727204595.66647: done queuing things up, now waiting for results queue to drain 42613 1727204595.66649: waiting for pending results... 42613 1727204595.66859: running TaskExecutor() for managed-node3/TASK: Get the routing rule for looking up the table 30600 42613 1727204595.66932: in run() - task 127b8e07-fff9-2f91-05d8-00000000005e 42613 1727204595.66945: variable 'ansible_search_path' from source: unknown 42613 1727204595.66984: calling self._execute() 42613 1727204595.67072: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204595.67079: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204595.67091: variable 'omit' from source: magic vars 42613 1727204595.67416: variable 'ansible_distribution_major_version' from source: facts 42613 1727204595.67431: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204595.67516: variable 'ansible_distribution_major_version' from source: facts 42613 1727204595.67521: Evaluated conditional (ansible_distribution_major_version != "7"): True 42613 1727204595.67529: variable 'omit' from source: magic vars 42613 1727204595.67551: variable 'omit' from source: magic vars 42613 1727204595.67581: variable 'omit' from source: magic vars 42613 1727204595.67621: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 42613 1727204595.67654: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 42613 1727204595.67674: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 42613 1727204595.67688: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204595.67698: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204595.67725: variable 'inventory_hostname' from source: host vars for 'managed-node3' 42613 1727204595.67728: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204595.67731: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204595.67817: Set connection var ansible_shell_executable to /bin/sh 42613 1727204595.67821: Set connection var ansible_pipelining to False 42613 1727204595.67829: Set connection var ansible_module_compression to ZIP_DEFLATED 42613 1727204595.67832: Set connection var ansible_connection to ssh 42613 1727204595.67840: Set connection var ansible_timeout to 10 42613 1727204595.67842: Set connection var ansible_shell_type to sh 42613 1727204595.67864: variable 'ansible_shell_executable' from source: unknown 42613 1727204595.67871: variable 'ansible_connection' from source: unknown 42613 1727204595.67874: variable 'ansible_module_compression' from source: unknown 42613 1727204595.67876: variable 'ansible_shell_type' from source: unknown 42613 1727204595.67878: variable 'ansible_shell_executable' from source: unknown 42613 1727204595.67880: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204595.67883: variable 'ansible_pipelining' from source: unknown 42613 1727204595.67885: variable 'ansible_timeout' from source: unknown 42613 1727204595.67887: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204595.68005: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 42613 1727204595.68013: variable 'omit' from source: magic vars 42613 1727204595.68018: starting attempt loop 42613 1727204595.68022: running the handler 42613 1727204595.68036: _low_level_execute_command(): starting 42613 1727204595.68043: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 42613 1727204595.68619: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204595.68624: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204595.68628: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204595.68688: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204595.68692: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204595.68699: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204595.68780: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 
2 <<< 42613 1727204595.70607: stdout chunk (state=3): >>>/root <<< 42613 1727204595.70716: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204595.70771: stderr chunk (state=3): >>><<< 42613 1727204595.70775: stdout chunk (state=3): >>><<< 42613 1727204595.70800: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204595.70811: _low_level_execute_command(): starting 42613 1727204595.70825: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204595.7079935-44525-224083011116469 `" && echo ansible-tmp-1727204595.7079935-44525-224083011116469="` echo /root/.ansible/tmp/ansible-tmp-1727204595.7079935-44525-224083011116469 `" ) && sleep 0' 42613 1727204595.71332: 
stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204595.71337: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204595.71348: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204595.71351: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204595.71406: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204595.71410: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204595.71417: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204595.71491: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204595.73651: stdout chunk (state=3): >>>ansible-tmp-1727204595.7079935-44525-224083011116469=/root/.ansible/tmp/ansible-tmp-1727204595.7079935-44525-224083011116469 <<< 42613 1727204595.73764: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204595.73826: stderr chunk (state=3): >>><<< 42613 1727204595.73830: stdout chunk (state=3): >>><<< 42613 1727204595.73851: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204595.7079935-44525-224083011116469=/root/.ansible/tmp/ansible-tmp-1727204595.7079935-44525-224083011116469 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204595.73882: variable 'ansible_module_compression' from source: unknown 42613 1727204595.73924: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-42613igbktgy6/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 42613 1727204595.73961: variable 'ansible_facts' from source: unknown 42613 1727204595.74014: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204595.7079935-44525-224083011116469/AnsiballZ_command.py 42613 1727204595.74133: Sending initial data 42613 1727204595.74136: Sent initial data (156 bytes) 42613 1727204595.74653: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204595.74657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 42613 1727204595.74661: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found <<< 42613 1727204595.74663: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204595.74714: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204595.74718: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204595.74722: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204595.74794: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204595.76558: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 
debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 42613 1727204595.76618: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 42613 1727204595.76689: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-42613igbktgy6/tmp02z2xk0g /root/.ansible/tmp/ansible-tmp-1727204595.7079935-44525-224083011116469/AnsiballZ_command.py <<< 42613 1727204595.76696: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204595.7079935-44525-224083011116469/AnsiballZ_command.py" <<< 42613 1727204595.76758: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-42613igbktgy6/tmp02z2xk0g" to remote "/root/.ansible/tmp/ansible-tmp-1727204595.7079935-44525-224083011116469/AnsiballZ_command.py" <<< 42613 1727204595.76761: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204595.7079935-44525-224083011116469/AnsiballZ_command.py" <<< 42613 1727204595.77450: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204595.77525: stderr chunk (state=3): >>><<< 42613 1727204595.77529: stdout chunk (state=3): >>><<< 42613 1727204595.77555: done transferring module to remote 42613 1727204595.77568: _low_level_execute_command(): starting 42613 1727204595.77575: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204595.7079935-44525-224083011116469/ /root/.ansible/tmp/ansible-tmp-1727204595.7079935-44525-224083011116469/AnsiballZ_command.py && sleep 0' 42613 1727204595.78070: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 42613 1727204595.78074: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 42613 1727204595.78078: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 42613 1727204595.78086: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204595.78135: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204595.78141: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204595.78146: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204595.78219: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204595.80257: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204595.80316: stderr chunk (state=3): >>><<< 42613 1727204595.80319: stdout chunk (state=3): >>><<< 42613 1727204595.80335: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204595.80341: _low_level_execute_command(): starting 42613 1727204595.80344: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204595.7079935-44525-224083011116469/AnsiballZ_command.py && sleep 0' 42613 1727204595.80846: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204595.80851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204595.80853: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204595.80863: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204595.80926: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204595.80936: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204595.80940: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204595.81012: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204595.98944: stdout chunk (state=3): >>> {"changed": true, "stdout": "30600:\tfrom all to 2001:db8::4/32 lookup 30600 proto static\n30601:\tnot from all dport 128-256 lookup 30600 proto static\n30602:\tfrom all lookup 30600 proto static", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "rule", "list", "table", "30600"], "start": "2024-09-24 15:03:15.983997", "end": "2024-09-24 15:03:15.988137", "delta": "0:00:00.004140", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 rule list table 30600", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 42613 1727204596.00847: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
<<< 42613 1727204596.00851: stdout chunk (state=3): >>><<< 42613 1727204596.00854: stderr chunk (state=3): >>><<< 42613 1727204596.00879: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "30600:\tfrom all to 2001:db8::4/32 lookup 30600 proto static\n30601:\tnot from all dport 128-256 lookup 30600 proto static\n30602:\tfrom all lookup 30600 proto static", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "rule", "list", "table", "30600"], "start": "2024-09-24 15:03:15.983997", "end": "2024-09-24 15:03:15.988137", "delta": "0:00:00.004140", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 rule list table 30600", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 
debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 42613 1727204596.00958: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -6 rule list table 30600', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204595.7079935-44525-224083011116469/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 42613 1727204596.00962: _low_level_execute_command(): starting 42613 1727204596.00967: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204595.7079935-44525-224083011116469/ > /dev/null 2>&1 && sleep 0' 42613 1727204596.01691: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 
10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204596.01777: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204596.01800: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204596.01897: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204596.04000: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204596.04046: stderr chunk (state=3): >>><<< 42613 1727204596.04058: stdout chunk (state=3): >>><<< 42613 1727204596.04087: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204596.04116: handler run 
complete
42613 1727204596.04271: Evaluated conditional (False): False
42613 1727204596.04274: attempt loop complete, returning result
42613 1727204596.04277: _execute() done
42613 1727204596.04279: dumping result to json
42613 1727204596.04281: done dumping result, returning
42613 1727204596.04283: done running TaskExecutor() for managed-node3/TASK: Get the routing rule for looking up the table 30600 [127b8e07-fff9-2f91-05d8-00000000005e]
42613 1727204596.04285: sending task result for task 127b8e07-fff9-2f91-05d8-00000000005e
42613 1727204596.04364: done sending task result for task 127b8e07-fff9-2f91-05d8-00000000005e
42613 1727204596.04369: WORKER PROCESS EXITING
ok: [managed-node3] => {
    "changed": false,
    "cmd": [
        "ip",
        "-6",
        "rule",
        "list",
        "table",
        "30600"
    ],
    "delta": "0:00:00.004140",
    "end": "2024-09-24 15:03:15.988137",
    "rc": 0,
    "start": "2024-09-24 15:03:15.983997"
}

STDOUT:

30600: from all to 2001:db8::4/32 lookup 30600 proto static
30601: not from all dport 128-256 lookup 30600 proto static
30602: from all lookup 30600 proto static

42613 1727204596.04462: no more pending results, returning what we have
42613 1727204596.04469: results queue empty
42613 1727204596.04470: checking for any_errors_fatal
42613 1727204596.04480: done checking for any_errors_fatal
42613 1727204596.04481: checking for max_fail_percentage
42613 1727204596.04484: done checking for max_fail_percentage
42613 1727204596.04485: checking to see if all hosts have failed and the running result is not ok
42613 1727204596.04486: done checking to see if all hosts have failed
42613 1727204596.04486: getting the remaining hosts for this loop
42613 1727204596.04488: done getting the remaining hosts for this loop
42613 1727204596.04493: getting the next task for host managed-node3
42613 1727204596.04501: done getting next task for host managed-node3
42613 1727204596.04504: ^ task is: TASK: Get the routing rule for looking up the table 'custom'
42613 1727204596.04507: ^ state is: HOST STATE:
block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 42613 1727204596.04511: getting variables 42613 1727204596.04512: in VariableManager get_vars() 42613 1727204596.04787: Calling all_inventory to load vars for managed-node3 42613 1727204596.04791: Calling groups_inventory to load vars for managed-node3 42613 1727204596.04794: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204596.04810: Calling all_plugins_play to load vars for managed-node3 42613 1727204596.04813: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204596.04817: Calling groups_plugins_play to load vars for managed-node3 42613 1727204596.06029: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204596.07347: done with get_vars() 42613 1727204596.07384: done getting variables 42613 1727204596.07455: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get the routing rule for looking up the table 'custom'] ****************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:136 Tuesday 24 September 2024 15:03:16 -0400 (0:00:00.411) 0:00:24.683 ***** 42613 1727204596.07488: entering _queue_task() for managed-node3/command 42613 1727204596.07892: worker is 1 (out of 1 available) 42613 1727204596.07906: exiting _queue_task() for managed-node3/command 42613 1727204596.07919: done queuing things up, now waiting for results queue to drain 42613 
1727204596.07920: waiting for pending results... 42613 1727204596.08490: running TaskExecutor() for managed-node3/TASK: Get the routing rule for looking up the table 'custom' 42613 1727204596.08496: in run() - task 127b8e07-fff9-2f91-05d8-00000000005f 42613 1727204596.08501: variable 'ansible_search_path' from source: unknown 42613 1727204596.08505: calling self._execute() 42613 1727204596.08617: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204596.08629: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204596.08644: variable 'omit' from source: magic vars 42613 1727204596.09047: variable 'ansible_distribution_major_version' from source: facts 42613 1727204596.09058: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204596.09149: variable 'ansible_distribution_major_version' from source: facts 42613 1727204596.09153: Evaluated conditional (ansible_distribution_major_version != "7"): True 42613 1727204596.09161: variable 'omit' from source: magic vars 42613 1727204596.09180: variable 'omit' from source: magic vars 42613 1727204596.09211: variable 'omit' from source: magic vars 42613 1727204596.09252: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 42613 1727204596.09284: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 42613 1727204596.09302: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 42613 1727204596.09318: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204596.09328: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204596.09359: variable 'inventory_hostname' from source: host vars for 'managed-node3' 42613 1727204596.09363: 
variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204596.09367: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204596.09445: Set connection var ansible_shell_executable to /bin/sh 42613 1727204596.09449: Set connection var ansible_pipelining to False 42613 1727204596.09459: Set connection var ansible_module_compression to ZIP_DEFLATED 42613 1727204596.09464: Set connection var ansible_connection to ssh 42613 1727204596.09467: Set connection var ansible_timeout to 10 42613 1727204596.09472: Set connection var ansible_shell_type to sh 42613 1727204596.09493: variable 'ansible_shell_executable' from source: unknown 42613 1727204596.09496: variable 'ansible_connection' from source: unknown 42613 1727204596.09499: variable 'ansible_module_compression' from source: unknown 42613 1727204596.09502: variable 'ansible_shell_type' from source: unknown 42613 1727204596.09505: variable 'ansible_shell_executable' from source: unknown 42613 1727204596.09507: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204596.09510: variable 'ansible_pipelining' from source: unknown 42613 1727204596.09512: variable 'ansible_timeout' from source: unknown 42613 1727204596.09517: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204596.09635: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 42613 1727204596.09646: variable 'omit' from source: magic vars 42613 1727204596.09651: starting attempt loop 42613 1727204596.09654: running the handler 42613 1727204596.09671: _low_level_execute_command(): starting 42613 1727204596.09684: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 42613 
1727204596.10259: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204596.10264: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found <<< 42613 1727204596.10271: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204596.10315: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204596.10318: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204596.10321: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204596.10402: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204596.12243: stdout chunk (state=3): >>>/root <<< 42613 1727204596.12457: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204596.12461: stdout chunk (state=3): >>><<< 42613 1727204596.12464: stderr chunk (state=3): >>><<< 42613 1727204596.12574: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204596.12581: _low_level_execute_command(): starting 42613 1727204596.12585: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204596.1249857-44538-12791588435566 `" && echo ansible-tmp-1727204596.1249857-44538-12791588435566="` echo /root/.ansible/tmp/ansible-tmp-1727204596.1249857-44538-12791588435566 `" ) && sleep 0' 42613 1727204596.13120: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204596.13135: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204596.13171: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204596.13211: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204596.13218: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204596.13222: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204596.13293: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204596.15495: stdout chunk (state=3): >>>ansible-tmp-1727204596.1249857-44538-12791588435566=/root/.ansible/tmp/ansible-tmp-1727204596.1249857-44538-12791588435566 <<< 42613 1727204596.15595: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204596.15660: stderr chunk (state=3): >>><<< 42613 1727204596.15665: stdout chunk (state=3): >>><<< 42613 1727204596.15688: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204596.1249857-44538-12791588435566=/root/.ansible/tmp/ansible-tmp-1727204596.1249857-44538-12791588435566 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204596.15718: variable 'ansible_module_compression' from source: unknown 42613 1727204596.15778: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-42613igbktgy6/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 42613 1727204596.15805: variable 'ansible_facts' from source: unknown 42613 1727204596.15857: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204596.1249857-44538-12791588435566/AnsiballZ_command.py 42613 1727204596.15983: Sending initial data 42613 1727204596.15986: Sent initial data (155 bytes) 42613 1727204596.16481: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204596.16484: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 42613 1727204596.16487: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204596.16490: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration <<< 42613 1727204596.16492: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found <<< 42613 1727204596.16495: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204596.16546: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204596.16550: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204596.16626: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204596.18410: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 42613 1727204596.18476: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 42613 1727204596.18540: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-42613igbktgy6/tmpitaelwon /root/.ansible/tmp/ansible-tmp-1727204596.1249857-44538-12791588435566/AnsiballZ_command.py <<< 42613 1727204596.18547: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204596.1249857-44538-12791588435566/AnsiballZ_command.py" <<< 42613 1727204596.18610: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-42613igbktgy6/tmpitaelwon" to remote "/root/.ansible/tmp/ansible-tmp-1727204596.1249857-44538-12791588435566/AnsiballZ_command.py" <<< 42613 1727204596.18615: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204596.1249857-44538-12791588435566/AnsiballZ_command.py" <<< 42613 1727204596.19303: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204596.19384: stderr chunk (state=3): >>><<< 42613 1727204596.19388: stdout chunk (state=3): >>><<< 42613 1727204596.19410: done transferring module to remote 42613 1727204596.19420: _low_level_execute_command(): starting 42613 1727204596.19425: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204596.1249857-44538-12791588435566/ /root/.ansible/tmp/ansible-tmp-1727204596.1249857-44538-12791588435566/AnsiballZ_command.py && sleep 0' 42613 1727204596.19928: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204596.19932: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 
1727204596.19934: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address <<< 42613 1727204596.19940: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 42613 1727204596.19942: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204596.19990: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204596.19994: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204596.20073: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204596.22097: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204596.22159: stderr chunk (state=3): >>><<< 42613 1727204596.22163: stdout chunk (state=3): >>><<< 42613 1727204596.22182: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204596.22186: _low_level_execute_command(): starting 42613 1727204596.22189: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204596.1249857-44538-12791588435566/AnsiballZ_command.py && sleep 0' 42613 1727204596.22664: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204596.22672: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found <<< 42613 1727204596.22691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204596.22694: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204596.22705: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204596.22771: stderr 
chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204596.22775: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204596.22780: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204596.22860: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204596.40811: stdout chunk (state=3): >>> {"changed": true, "stdout": "200:\tfrom 198.51.100.56/26 lookup custom proto static", "stderr": "", "rc": 0, "cmd": ["ip", "rule", "list", "table", "custom"], "start": "2024-09-24 15:03:16.402816", "end": "2024-09-24 15:03:16.406874", "delta": "0:00:00.004058", "msg": "", "invocation": {"module_args": {"_raw_params": "ip rule list table custom", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 42613 1727204596.42610: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
<<< 42613 1727204596.42677: stderr chunk (state=3): >>><<< 42613 1727204596.42681: stdout chunk (state=3): >>><<< 42613 1727204596.42698: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "200:\tfrom 198.51.100.56/26 lookup custom proto static", "stderr": "", "rc": 0, "cmd": ["ip", "rule", "list", "table", "custom"], "start": "2024-09-24 15:03:16.402816", "end": "2024-09-24 15:03:16.406874", "delta": "0:00:00.004058", "msg": "", "invocation": {"module_args": {"_raw_params": "ip rule list table custom", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
42613 1727204596.42734: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip rule list table custom', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204596.1249857-44538-12791588435566/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 42613 1727204596.42743: _low_level_execute_command(): starting 42613 1727204596.42746: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204596.1249857-44538-12791588435566/ > /dev/null 2>&1 && sleep 0' 42613 1727204596.43264: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204596.43272: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204596.43279: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204596.43330: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204596.43334: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204596.43414: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204596.45461: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204596.45522: stderr chunk (state=3): >>><<< 42613 1727204596.45526: stdout chunk (state=3): >>><<< 42613 1727204596.45544: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 
1727204596.45547: handler run complete 42613 1727204596.45569: Evaluated conditional (False): False 42613 1727204596.45582: attempt loop complete, returning result 42613 1727204596.45586: _execute() done 42613 1727204596.45588: dumping result to json 42613 1727204596.45593: done dumping result, returning 42613 1727204596.45602: done running TaskExecutor() for managed-node3/TASK: Get the routing rule for looking up the table 'custom' [127b8e07-fff9-2f91-05d8-00000000005f] 42613 1727204596.45607: sending task result for task 127b8e07-fff9-2f91-05d8-00000000005f 42613 1727204596.45716: done sending task result for task 127b8e07-fff9-2f91-05d8-00000000005f 42613 1727204596.45719: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": [ "ip", "rule", "list", "table", "custom" ], "delta": "0:00:00.004058", "end": "2024-09-24 15:03:16.406874", "rc": 0, "start": "2024-09-24 15:03:16.402816" } STDOUT: 200: from 198.51.100.56/26 lookup custom proto static 42613 1727204596.45803: no more pending results, returning what we have 42613 1727204596.45807: results queue empty 42613 1727204596.45808: checking for any_errors_fatal 42613 1727204596.45819: done checking for any_errors_fatal 42613 1727204596.45819: checking for max_fail_percentage 42613 1727204596.45821: done checking for max_fail_percentage 42613 1727204596.45822: checking to see if all hosts have failed and the running result is not ok 42613 1727204596.45823: done checking to see if all hosts have failed 42613 1727204596.45824: getting the remaining hosts for this loop 42613 1727204596.45825: done getting the remaining hosts for this loop 42613 1727204596.45831: getting the next task for host managed-node3 42613 1727204596.45840: done getting next task for host managed-node3 42613 1727204596.45843: ^ task is: TASK: Get the IPv4 routing rule for the connection "{{ interface }}" 42613 1727204596.45845: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 42613 1727204596.45848: getting variables 42613 1727204596.45850: in VariableManager get_vars() 42613 1727204596.45889: Calling all_inventory to load vars for managed-node3 42613 1727204596.45891: Calling groups_inventory to load vars for managed-node3 42613 1727204596.45893: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204596.45904: Calling all_plugins_play to load vars for managed-node3 42613 1727204596.45907: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204596.45910: Calling groups_plugins_play to load vars for managed-node3 42613 1727204596.47045: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204596.48247: done with get_vars() 42613 1727204596.48277: done getting variables 42613 1727204596.48326: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 42613 1727204596.48428: variable 'interface' from source: set_fact TASK [Get the IPv4 routing rule for the connection "ethtest0"] ***************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:143 Tuesday 24 September 2024 15:03:16 -0400 (0:00:00.409) 0:00:25.092 ***** 42613 1727204596.48453: entering _queue_task() for managed-node3/command 42613 1727204596.48738: worker is 1 (out of 1 available) 42613 1727204596.48755: exiting _queue_task() for managed-node3/command 42613 1727204596.48768: done queuing things up, now waiting for results queue to drain 42613 
1727204596.48770: waiting for pending results... 42613 1727204596.48978: running TaskExecutor() for managed-node3/TASK: Get the IPv4 routing rule for the connection "ethtest0" 42613 1727204596.49048: in run() - task 127b8e07-fff9-2f91-05d8-000000000060 42613 1727204596.49060: variable 'ansible_search_path' from source: unknown 42613 1727204596.49098: calling self._execute() 42613 1727204596.49184: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204596.49190: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204596.49199: variable 'omit' from source: magic vars 42613 1727204596.49530: variable 'ansible_distribution_major_version' from source: facts 42613 1727204596.49546: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204596.49552: variable 'omit' from source: magic vars 42613 1727204596.49575: variable 'omit' from source: magic vars 42613 1727204596.49655: variable 'interface' from source: set_fact 42613 1727204596.49676: variable 'omit' from source: magic vars 42613 1727204596.49714: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 42613 1727204596.49747: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 42613 1727204596.49768: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 42613 1727204596.49786: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204596.49796: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204596.49822: variable 'inventory_hostname' from source: host vars for 'managed-node3' 42613 1727204596.49825: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204596.49828: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204596.49915: Set connection var ansible_shell_executable to /bin/sh 42613 1727204596.49919: Set connection var ansible_pipelining to False 42613 1727204596.49927: Set connection var ansible_module_compression to ZIP_DEFLATED 42613 1727204596.49929: Set connection var ansible_connection to ssh 42613 1727204596.49935: Set connection var ansible_timeout to 10 42613 1727204596.49937: Set connection var ansible_shell_type to sh 42613 1727204596.49960: variable 'ansible_shell_executable' from source: unknown 42613 1727204596.49963: variable 'ansible_connection' from source: unknown 42613 1727204596.49968: variable 'ansible_module_compression' from source: unknown 42613 1727204596.49970: variable 'ansible_shell_type' from source: unknown 42613 1727204596.49973: variable 'ansible_shell_executable' from source: unknown 42613 1727204596.49975: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204596.49979: variable 'ansible_pipelining' from source: unknown 42613 1727204596.49981: variable 'ansible_timeout' from source: unknown 42613 1727204596.49991: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204596.50111: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 42613 1727204596.50123: variable 'omit' from source: magic vars 42613 1727204596.50128: starting attempt loop 42613 1727204596.50131: running the handler 42613 1727204596.50147: _low_level_execute_command(): starting 42613 1727204596.50155: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 42613 1727204596.50734: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204596.50743: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204596.50747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204596.50788: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204596.50801: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204596.50888: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204596.52727: stdout chunk (state=3): >>>/root <<< 42613 1727204596.52831: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204596.52894: stderr chunk (state=3): >>><<< 42613 1727204596.52898: stdout chunk (state=3): >>><<< 42613 1727204596.52925: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 
10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204596.52939: _low_level_execute_command(): starting 42613 1727204596.52947: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204596.5292437-44549-165083043192339 `" && echo ansible-tmp-1727204596.5292437-44549-165083043192339="` echo /root/.ansible/tmp/ansible-tmp-1727204596.5292437-44549-165083043192339 `" ) && sleep 0' 42613 1727204596.53454: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204596.53461: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204596.53463: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204596.53488: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204596.53522: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204596.53525: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204596.53528: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204596.53609: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204596.55771: stdout chunk (state=3): >>>ansible-tmp-1727204596.5292437-44549-165083043192339=/root/.ansible/tmp/ansible-tmp-1727204596.5292437-44549-165083043192339 <<< 42613 1727204596.55886: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204596.55959: stderr chunk (state=3): >>><<< 42613 1727204596.55963: stdout chunk (state=3): >>><<< 42613 1727204596.55975: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204596.5292437-44549-165083043192339=/root/.ansible/tmp/ansible-tmp-1727204596.5292437-44549-165083043192339 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204596.56005: variable 'ansible_module_compression' from source: unknown 42613 1727204596.56051: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-42613igbktgy6/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 42613 1727204596.56090: variable 'ansible_facts' from source: unknown 42613 1727204596.56141: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204596.5292437-44549-165083043192339/AnsiballZ_command.py 42613 1727204596.56258: Sending initial data 42613 1727204596.56262: Sent initial data (156 bytes) 42613 1727204596.56770: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204596.56774: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found <<< 42613 1727204596.56776: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 42613 1727204596.56779: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 42613 1727204596.56781: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204596.56845: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204596.56849: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204596.56852: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204596.56914: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204596.58695: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 42613 1727204596.58762: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 42613 1727204596.58828: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-42613igbktgy6/tmp2k4no2rn /root/.ansible/tmp/ansible-tmp-1727204596.5292437-44549-165083043192339/AnsiballZ_command.py <<< 42613 1727204596.58831: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204596.5292437-44549-165083043192339/AnsiballZ_command.py" <<< 42613 1727204596.58894: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-42613igbktgy6/tmp2k4no2rn" to remote "/root/.ansible/tmp/ansible-tmp-1727204596.5292437-44549-165083043192339/AnsiballZ_command.py" <<< 42613 1727204596.58898: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204596.5292437-44549-165083043192339/AnsiballZ_command.py" <<< 42613 1727204596.59556: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204596.59639: stderr chunk (state=3): >>><<< 42613 1727204596.59644: stdout chunk (state=3): >>><<< 42613 1727204596.59662: done transferring module to remote 42613 1727204596.59679: _low_level_execute_command(): starting 42613 1727204596.59682: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204596.5292437-44549-165083043192339/ /root/.ansible/tmp/ansible-tmp-1727204596.5292437-44549-165083043192339/AnsiballZ_command.py && sleep 0' 42613 1727204596.60172: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204596.60176: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found <<< 42613 1727204596.60179: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204596.60181: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204596.60193: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204596.60247: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204596.60250: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204596.60253: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204596.60330: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204596.62355: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204596.62414: stderr chunk (state=3): >>><<< 42613 1727204596.62417: stdout chunk (state=3): >>><<< 42613 1727204596.62433: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204596.62437: _low_level_execute_command(): starting 42613 1727204596.62444: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204596.5292437-44549-165083043192339/AnsiballZ_command.py && sleep 0' 42613 1727204596.62951: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204596.62955: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address <<< 42613 1727204596.62958: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found <<< 42613 1727204596.62960: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204596.63007: stderr 
chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204596.63010: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204596.63013: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204596.63097: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204596.82616: stdout chunk (state=3): >>> {"changed": true, "stdout": "ipv4.routing-rules: priority 30200 from 198.51.100.58/26 table 30200, priority 30201 from 0.0.0.0/0 fwmark 0x1/0x1 table 30200, priority 30202 from 0.0.0.0/0 ipproto 6 table 30200, priority 30203 from 0.0.0.0/0 sport 128-256 table 30200, priority 30204 from 0.0.0.0/0 tos 0x08 table 30200, priority 30400 to 198.51.100.128/26 table 30400, priority 30401 from 0.0.0.0/0 iif iiftest table 30400, priority 30402 from 0.0.0.0/0 oif oiftest table 30400, priority 30403 from 0.0.0.0/0 table 30400, priority 200 from 198.51.100.56/26 table 200", "stderr": "", "rc": 0, "cmd": ["nmcli", "-f", "ipv4.routing-rules", "c", "show", "ethtest0"], "start": "2024-09-24 15:03:16.804597", "end": "2024-09-24 15:03:16.824744", "delta": "0:00:00.020147", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f ipv4.routing-rules c show \"ethtest0\"", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 42613 1727204596.84580: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
<<< 42613 1727204596.84585: stdout chunk (state=3): >>><<< 42613 1727204596.84588: stderr chunk (state=3): >>><<< 42613 1727204596.84591: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "ipv4.routing-rules: priority 30200 from 198.51.100.58/26 table 30200, priority 30201 from 0.0.0.0/0 fwmark 0x1/0x1 table 30200, priority 30202 from 0.0.0.0/0 ipproto 6 table 30200, priority 30203 from 0.0.0.0/0 sport 128-256 table 30200, priority 30204 from 0.0.0.0/0 tos 0x08 table 30200, priority 30400 to 198.51.100.128/26 table 30400, priority 30401 from 0.0.0.0/0 iif iiftest table 30400, priority 30402 from 0.0.0.0/0 oif oiftest table 30400, priority 30403 from 0.0.0.0/0 table 30400, priority 200 from 198.51.100.56/26 table 200", "stderr": "", "rc": 0, "cmd": ["nmcli", "-f", "ipv4.routing-rules", "c", "show", "ethtest0"], "start": "2024-09-24 15:03:16.804597", "end": "2024-09-24 15:03:16.824744", "delta": "0:00:00.020147", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f ipv4.routing-rules c show \"ethtest0\"", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 42613 1727204596.84635: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f ipv4.routing-rules c show "ethtest0"', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204596.5292437-44549-165083043192339/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 42613 1727204596.84661: _low_level_execute_command(): starting 42613 1727204596.84675: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204596.5292437-44549-165083043192339/ > /dev/null 2>&1 && sleep 0' 42613 1727204596.85310: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204596.85325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 
10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204596.85393: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204596.85402: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204596.85404: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204596.85477: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204596.87602: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204596.87606: stdout chunk (state=3): >>><<< 42613 1727204596.87608: stderr chunk (state=3): >>><<< 42613 1727204596.87626: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204596.87681: handler run complete 42613 1727204596.87684: Evaluated conditional (False): False 42613 1727204596.87686: attempt loop complete, returning result 42613 1727204596.87699: _execute() done 42613 1727204596.87706: dumping result to json 42613 1727204596.87715: done dumping result, returning 42613 1727204596.87728: done running TaskExecutor() for managed-node3/TASK: Get the IPv4 routing rule for the connection "ethtest0" [127b8e07-fff9-2f91-05d8-000000000060] 42613 1727204596.87740: sending task result for task 127b8e07-fff9-2f91-05d8-000000000060 42613 1727204596.87945: done sending task result for task 127b8e07-fff9-2f91-05d8-000000000060 42613 1727204596.87948: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": [ "nmcli", "-f", "ipv4.routing-rules", "c", "show", "ethtest0" ], "delta": "0:00:00.020147", "end": "2024-09-24 15:03:16.824744", "rc": 0, "start": "2024-09-24 15:03:16.804597" } STDOUT: ipv4.routing-rules: priority 30200 from 198.51.100.58/26 table 30200, priority 30201 from 0.0.0.0/0 fwmark 0x1/0x1 table 30200, priority 30202 from 0.0.0.0/0 ipproto 6 table 30200, priority 30203 from 0.0.0.0/0 sport 128-256 table 30200, priority 30204 from 0.0.0.0/0 tos 0x08 table 30200, priority 30400 to 198.51.100.128/26 table 30400, priority 30401 from 0.0.0.0/0 iif iiftest table 30400, priority 30402 from 0.0.0.0/0 oif oiftest table 30400, priority 30403 from 0.0.0.0/0 table 30400, priority 200 from 
198.51.100.56/26 table 200 42613 1727204596.88051: no more pending results, returning what we have 42613 1727204596.88055: results queue empty 42613 1727204596.88056: checking for any_errors_fatal 42613 1727204596.88070: done checking for any_errors_fatal 42613 1727204596.88070: checking for max_fail_percentage 42613 1727204596.88074: done checking for max_fail_percentage 42613 1727204596.88074: checking to see if all hosts have failed and the running result is not ok 42613 1727204596.88075: done checking to see if all hosts have failed 42613 1727204596.88076: getting the remaining hosts for this loop 42613 1727204596.88078: done getting the remaining hosts for this loop 42613 1727204596.88082: getting the next task for host managed-node3 42613 1727204596.88090: done getting next task for host managed-node3 42613 1727204596.88093: ^ task is: TASK: Get the IPv6 routing rule for the connection "{{ interface }}" 42613 1727204596.88096: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204596.88101: getting variables 42613 1727204596.88102: in VariableManager get_vars() 42613 1727204596.88148: Calling all_inventory to load vars for managed-node3 42613 1727204596.88151: Calling groups_inventory to load vars for managed-node3 42613 1727204596.88154: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204596.88371: Calling all_plugins_play to load vars for managed-node3 42613 1727204596.88472: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204596.88479: Calling groups_plugins_play to load vars for managed-node3 42613 1727204596.90375: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204596.92809: done with get_vars() 42613 1727204596.92856: done getting variables 42613 1727204596.92924: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 42613 1727204596.93076: variable 'interface' from source: set_fact TASK [Get the IPv6 routing rule for the connection "ethtest0"] ***************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:149 Tuesday 24 September 2024 15:03:16 -0400 (0:00:00.446) 0:00:25.539 ***** 42613 1727204596.93106: entering _queue_task() for managed-node3/command 42613 1727204596.93696: worker is 1 (out of 1 available) 42613 1727204596.93708: exiting _queue_task() for managed-node3/command 42613 1727204596.93719: done queuing things up, now waiting for results queue to drain 42613 1727204596.93720: waiting for pending results... 
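[Editor's note] The IPv4 task above returns every rule packed into one comma-separated `ipv4.routing-rules` value in stdout. As a minimal sketch (plain POSIX shell, using a two-rule excerpt copied verbatim from the captured stdout), the value can be split into one rule per line for easier per-rule checks:

```shell
# Two rules excerpted verbatim from the task's captured stdout above.
rules='priority 30200 from 198.51.100.58/26 table 30200, priority 30201 from 0.0.0.0/0 fwmark 0x1/0x1 table 30200'

# nmcli separates rules with ", "; turn that into one rule per line
# and strip the leading space left after each comma.
printf '%s\n' "$rules" | tr ',' '\n' | sed 's/^ *//'
```

Splitting first makes assertions such as "a rule with priority 30201 routes fwmark 0x1/0x1 to table 30200" a simple `grep` over single lines.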
42613 1727204596.93971: running TaskExecutor() for managed-node3/TASK: Get the IPv6 routing rule for the connection "ethtest0" 42613 1727204596.94059: in run() - task 127b8e07-fff9-2f91-05d8-000000000061 42613 1727204596.94171: variable 'ansible_search_path' from source: unknown 42613 1727204596.94175: calling self._execute() 42613 1727204596.94233: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204596.94250: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204596.94274: variable 'omit' from source: magic vars 42613 1727204596.94739: variable 'ansible_distribution_major_version' from source: facts 42613 1727204596.94760: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204596.94774: variable 'omit' from source: magic vars 42613 1727204596.94801: variable 'omit' from source: magic vars 42613 1727204596.94944: variable 'interface' from source: set_fact 42613 1727204596.94972: variable 'omit' from source: magic vars 42613 1727204596.95026: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 42613 1727204596.95085: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 42613 1727204596.95111: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 42613 1727204596.95149: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204596.95171: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204596.95262: variable 'inventory_hostname' from source: host vars for 'managed-node3' 42613 1727204596.95265: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204596.95268: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 
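[Editor's note] Each module execution in this trace is preceded by the same remote tmpdir handshake (`umask 77 && mkdir -p ... && mkdir ansible-tmp-<timestamp>-<pid>-<random> && echo name=path`). A minimal sketch of that idiom in plain shell, with illustrative stand-in paths rather than Ansible's real ones:

```shell
# Sketch of the remote tmpdir idiom seen in the log; paths and the random
# suffix are illustrative stand-ins, not Ansible's actual values.
# umask 77 clears group/other permission bits, so both directories are
# created mode 0700 (private to the remote user).
tmp_root="${TMPDIR:-/tmp}/ansible-demo-tmp"    # stand-in for ~/.ansible/tmp
stamp="ansible-tmp-$(date +%s)-$$-123456789"   # timestamp-pid-random, as in the log
( umask 77 && mkdir -p "$tmp_root" && mkdir "$tmp_root/$stamp" \
  && echo "$stamp=$tmp_root/$stamp" )
```

Running `umask` and `mkdir` inside a subshell keeps the restrictive umask from leaking into the rest of the script; the echoed `name=path` line is what the controller reads back from stdout to learn where to upload `AnsiballZ_command.py`.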
42613 1727204596.95382: Set connection var ansible_shell_executable to /bin/sh 42613 1727204596.95398: Set connection var ansible_pipelining to False 42613 1727204596.95475: Set connection var ansible_module_compression to ZIP_DEFLATED 42613 1727204596.95478: Set connection var ansible_connection to ssh 42613 1727204596.95481: Set connection var ansible_timeout to 10 42613 1727204596.95483: Set connection var ansible_shell_type to sh 42613 1727204596.95485: variable 'ansible_shell_executable' from source: unknown 42613 1727204596.95487: variable 'ansible_connection' from source: unknown 42613 1727204596.95489: variable 'ansible_module_compression' from source: unknown 42613 1727204596.95491: variable 'ansible_shell_type' from source: unknown 42613 1727204596.95495: variable 'ansible_shell_executable' from source: unknown 42613 1727204596.95497: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204596.95498: variable 'ansible_pipelining' from source: unknown 42613 1727204596.95501: variable 'ansible_timeout' from source: unknown 42613 1727204596.95508: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204596.95696: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 42613 1727204596.95771: variable 'omit' from source: magic vars 42613 1727204596.95774: starting attempt loop 42613 1727204596.95776: running the handler 42613 1727204596.95778: _low_level_execute_command(): starting 42613 1727204596.95781: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 42613 1727204596.96779: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 42613 1727204596.96829: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204596.96872: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204596.96884: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204596.96994: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204596.98840: stdout chunk (state=3): >>>/root <<< 42613 1727204596.99043: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204596.99047: stdout chunk (state=3): >>><<< 42613 1727204596.99050: stderr chunk (state=3): >>><<< 42613 1727204596.99192: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204596.99196: _low_level_execute_command(): starting 42613 1727204596.99199: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204596.9908311-44565-250425085579446 `" && echo ansible-tmp-1727204596.9908311-44565-250425085579446="` echo /root/.ansible/tmp/ansible-tmp-1727204596.9908311-44565-250425085579446 `" ) && sleep 0' 42613 1727204596.99847: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204596.99973: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204596.99987: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204597.00021: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204597.00117: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204597.02326: stdout chunk (state=3): >>>ansible-tmp-1727204596.9908311-44565-250425085579446=/root/.ansible/tmp/ansible-tmp-1727204596.9908311-44565-250425085579446 <<< 42613 1727204597.02547: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204597.02551: stdout chunk (state=3): >>><<< 42613 1727204597.02554: stderr chunk (state=3): >>><<< 42613 1727204597.02579: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204596.9908311-44565-250425085579446=/root/.ansible/tmp/ansible-tmp-1727204596.9908311-44565-250425085579446 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 
10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204597.02773: variable 'ansible_module_compression' from source: unknown 42613 1727204597.02777: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-42613igbktgy6/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 42613 1727204597.02779: variable 'ansible_facts' from source: unknown 42613 1727204597.02798: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204596.9908311-44565-250425085579446/AnsiballZ_command.py 42613 1727204597.03010: Sending initial data 42613 1727204597.03018: Sent initial data (156 bytes) 42613 1727204597.03682: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204597.03703: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204597.03722: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204597.03764: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration <<< 42613 1727204597.03779: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204597.03794: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204597.03881: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204597.03902: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204597.03921: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204597.03948: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204597.04056: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204597.05874: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 42613 1727204597.05969: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 42613 1727204597.06072: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-42613igbktgy6/tmpplg03vg0 /root/.ansible/tmp/ansible-tmp-1727204596.9908311-44565-250425085579446/AnsiballZ_command.py <<< 42613 1727204597.06085: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204596.9908311-44565-250425085579446/AnsiballZ_command.py" <<< 42613 1727204597.06146: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-42613igbktgy6/tmpplg03vg0" to remote "/root/.ansible/tmp/ansible-tmp-1727204596.9908311-44565-250425085579446/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204596.9908311-44565-250425085579446/AnsiballZ_command.py" <<< 42613 1727204597.07063: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204597.07240: stderr chunk (state=3): >>><<< 42613 1727204597.07243: stdout chunk (state=3): >>><<< 42613 1727204597.07246: done transferring module to remote 42613 1727204597.07248: _low_level_execute_command(): starting 42613 1727204597.07251: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204596.9908311-44565-250425085579446/ /root/.ansible/tmp/ansible-tmp-1727204596.9908311-44565-250425085579446/AnsiballZ_command.py && sleep 0' 42613 1727204597.08045: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204597.08050: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204597.08084: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204597.08200: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204597.10292: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204597.10330: stderr chunk (state=3): >>><<< 42613 1727204597.10351: stdout chunk (state=3): >>><<< 42613 1727204597.10379: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204597.10388: _low_level_execute_command(): starting 42613 1727204597.10398: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204596.9908311-44565-250425085579446/AnsiballZ_command.py && sleep 0' 42613 1727204597.11068: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204597.11085: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204597.11100: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204597.11117: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 42613 1727204597.11133: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 42613 1727204597.11149: stderr chunk (state=3): >>>debug2: match not found <<< 42613 1727204597.11162: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204597.11182: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 42613 1727204597.11286: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 
setting O_NONBLOCK <<< 42613 1727204597.11298: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204597.11417: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204597.30902: stdout chunk (state=3): >>> {"changed": true, "stdout": "ipv6.routing-rules: priority 30600 to 2001:db8::4/32 table 30600, priority 30601 not from ::/0 dport 128-256 table 30600, priority 30602 from ::/0 table 30600", "stderr": "", "rc": 0, "cmd": ["nmcli", "-f", "ipv6.routing-rules", "c", "show", "ethtest0"], "start": "2024-09-24 15:03:17.288921", "end": "2024-09-24 15:03:17.307393", "delta": "0:00:00.018472", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f ipv6.routing-rules c show \"ethtest0\"", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 42613 1727204597.32832: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
<<< 42613 1727204597.32837: stdout chunk (state=3): >>><<< 42613 1727204597.32884: stderr chunk (state=3): >>><<< 42613 1727204597.32992: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "ipv6.routing-rules: priority 30600 to 2001:db8::4/32 table 30600, priority 30601 not from ::/0 dport 128-256 table 30600, priority 30602 from ::/0 table 30600", "stderr": "", "rc": 0, "cmd": ["nmcli", "-f", "ipv6.routing-rules", "c", "show", "ethtest0"], "start": "2024-09-24 15:03:17.288921", "end": "2024-09-24 15:03:17.307393", "delta": "0:00:00.018472", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f ipv6.routing-rules c show \"ethtest0\"", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 42613 1727204597.33033: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f ipv6.routing-rules c show "ethtest0"', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204596.9908311-44565-250425085579446/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 42613 1727204597.33045: _low_level_execute_command(): starting 42613 1727204597.33051: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204596.9908311-44565-250425085579446/ > /dev/null 2>&1 && sleep 0' 42613 1727204597.34574: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204597.34579: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration <<< 42613 1727204597.34759: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 42613 1727204597.34771: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204597.35273: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204597.35277: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204597.35325: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204597.35428: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204597.37489: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204597.37601: stderr chunk (state=3): >>><<< 42613 1727204597.37605: stdout chunk (state=3): >>><<< 42613 1727204597.37687: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204597.37696: handler run complete 42613 1727204597.37730: Evaluated conditional (False): False 42613 1727204597.37745: attempt loop complete, returning result 42613 1727204597.37748: _execute() done 42613 1727204597.37751: dumping result to json 42613 1727204597.37757: done dumping result, returning 42613 1727204597.37772: done running TaskExecutor() for managed-node3/TASK: Get the IPv6 routing rule for the connection "ethtest0" [127b8e07-fff9-2f91-05d8-000000000061] 42613 1727204597.37775: sending task result for task 127b8e07-fff9-2f91-05d8-000000000061 42613 1727204597.38189: done sending task result for task 127b8e07-fff9-2f91-05d8-000000000061 42613 1727204597.38192: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": [ "nmcli", "-f", "ipv6.routing-rules", "c", "show", "ethtest0" ], "delta": "0:00:00.018472", "end": "2024-09-24 15:03:17.307393", "rc": 0, "start": "2024-09-24 15:03:17.288921" } STDOUT: ipv6.routing-rules: priority 30600 to 2001:db8::4/32 table 30600, priority 30601 not from ::/0 dport 128-256 table 30600, priority 30602 from ::/0 table 30600 42613 1727204597.38363: no more pending results, returning what we have 42613 1727204597.38371: results queue empty 42613 1727204597.38372: checking for any_errors_fatal 42613 1727204597.38382: done checking for any_errors_fatal 42613 1727204597.38383: checking for max_fail_percentage 42613 1727204597.38387: done checking for max_fail_percentage 42613 1727204597.38388: checking to see if all hosts have failed and the running result is not ok 42613 1727204597.38389: done checking to see if all hosts have failed 42613 1727204597.38389: getting the remaining hosts for this loop 42613 1727204597.38391: done getting the remaining 
hosts for this loop 42613 1727204597.38396: getting the next task for host managed-node3 42613 1727204597.38404: done getting next task for host managed-node3 42613 1727204597.38407: ^ task is: TASK: Assert that the routing rule with table lookup 30200 matches the specified rule 42613 1727204597.38410: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 42613 1727204597.38414: getting variables 42613 1727204597.38416: in VariableManager get_vars() 42613 1727204597.38463: Calling all_inventory to load vars for managed-node3 42613 1727204597.38732: Calling groups_inventory to load vars for managed-node3 42613 1727204597.38736: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204597.38752: Calling all_plugins_play to load vars for managed-node3 42613 1727204597.38755: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204597.38759: Calling groups_plugins_play to load vars for managed-node3 42613 1727204597.41546: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204597.44356: done with get_vars() 42613 1727204597.44401: done getting variables 42613 1727204597.44479: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert that the routing rule with table lookup 30200 matches the specified rule] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:155 Tuesday 24 September 2024 15:03:17 
-0400 (0:00:00.514) 0:00:26.053 ***** 42613 1727204597.44517: entering _queue_task() for managed-node3/assert 42613 1727204597.44912: worker is 1 (out of 1 available) 42613 1727204597.45040: exiting _queue_task() for managed-node3/assert 42613 1727204597.45052: done queuing things up, now waiting for results queue to drain 42613 1727204597.45054: waiting for pending results... 42613 1727204597.45291: running TaskExecutor() for managed-node3/TASK: Assert that the routing rule with table lookup 30200 matches the specified rule 42613 1727204597.45412: in run() - task 127b8e07-fff9-2f91-05d8-000000000062 42613 1727204597.45432: variable 'ansible_search_path' from source: unknown 42613 1727204597.45481: calling self._execute() 42613 1727204597.45625: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204597.45629: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204597.45632: variable 'omit' from source: magic vars 42613 1727204597.46142: variable 'ansible_distribution_major_version' from source: facts 42613 1727204597.46187: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204597.46310: variable 'ansible_distribution_major_version' from source: facts 42613 1727204597.46346: Evaluated conditional (ansible_distribution_major_version != "7"): True 42613 1727204597.46349: variable 'omit' from source: magic vars 42613 1727204597.46371: variable 'omit' from source: magic vars 42613 1727204597.46428: variable 'omit' from source: magic vars 42613 1727204597.46513: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 42613 1727204597.46536: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 42613 1727204597.46576: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 42613 1727204597.46600: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204597.46670: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204597.46675: variable 'inventory_hostname' from source: host vars for 'managed-node3' 42613 1727204597.46678: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204597.46681: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204597.46805: Set connection var ansible_shell_executable to /bin/sh 42613 1727204597.46817: Set connection var ansible_pipelining to False 42613 1727204597.46893: Set connection var ansible_module_compression to ZIP_DEFLATED 42613 1727204597.46896: Set connection var ansible_connection to ssh 42613 1727204597.46900: Set connection var ansible_timeout to 10 42613 1727204597.46902: Set connection var ansible_shell_type to sh 42613 1727204597.46904: variable 'ansible_shell_executable' from source: unknown 42613 1727204597.46907: variable 'ansible_connection' from source: unknown 42613 1727204597.46909: variable 'ansible_module_compression' from source: unknown 42613 1727204597.46911: variable 'ansible_shell_type' from source: unknown 42613 1727204597.46914: variable 'ansible_shell_executable' from source: unknown 42613 1727204597.46916: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204597.46918: variable 'ansible_pipelining' from source: unknown 42613 1727204597.47001: variable 'ansible_timeout' from source: unknown 42613 1727204597.47004: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204597.47107: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 42613 1727204597.47126: variable 'omit' from source: magic vars 42613 1727204597.47138: starting attempt loop 42613 1727204597.47147: running the handler 42613 1727204597.47364: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 42613 1727204597.47655: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 42613 1727204597.47764: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 42613 1727204597.47816: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 42613 1727204597.47861: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 42613 1727204597.47975: variable 'route_rule_table_30200' from source: set_fact 42613 1727204597.48024: Evaluated conditional (route_rule_table_30200.stdout is search("30200:(\s+)from 198.51.100.58/26 lookup 30200")): True 42613 1727204597.48204: variable 'route_rule_table_30200' from source: set_fact 42613 1727204597.48249: Evaluated conditional (route_rule_table_30200.stdout is search("30201:(\s+)from all fwmark 0x1/0x1 lookup 30200")): True 42613 1727204597.48422: variable 'route_rule_table_30200' from source: set_fact 42613 1727204597.48462: Evaluated conditional (route_rule_table_30200.stdout is search("30202:(\s+)from all ipproto tcp lookup 30200")): True 42613 1727204597.48637: variable 'route_rule_table_30200' from source: set_fact 42613 1727204597.48669: Evaluated conditional (route_rule_table_30200.stdout is search("30203:(\s+)from all sport 128-256 lookup 30200")): True 42613 1727204597.48854: variable 'route_rule_table_30200' from source: set_fact 42613 1727204597.48872: Evaluated conditional 
(route_rule_table_30200.stdout is search("30204:(\s+)from all tos (0x08|throughput) lookup 30200")): True 42613 1727204597.48890: handler run complete 42613 1727204597.48912: attempt loop complete, returning result 42613 1727204597.48963: _execute() done 42613 1727204597.48968: dumping result to json 42613 1727204597.48971: done dumping result, returning 42613 1727204597.48975: done running TaskExecutor() for managed-node3/TASK: Assert that the routing rule with table lookup 30200 matches the specified rule [127b8e07-fff9-2f91-05d8-000000000062] 42613 1727204597.48977: sending task result for task 127b8e07-fff9-2f91-05d8-000000000062 ok: [managed-node3] => { "changed": false } MSG: All assertions passed 42613 1727204597.49223: no more pending results, returning what we have 42613 1727204597.49227: results queue empty 42613 1727204597.49228: checking for any_errors_fatal 42613 1727204597.49235: done checking for any_errors_fatal 42613 1727204597.49236: checking for max_fail_percentage 42613 1727204597.49239: done checking for max_fail_percentage 42613 1727204597.49240: checking to see if all hosts have failed and the running result is not ok 42613 1727204597.49241: done checking to see if all hosts have failed 42613 1727204597.49242: getting the remaining hosts for this loop 42613 1727204597.49243: done getting the remaining hosts for this loop 42613 1727204597.49248: getting the next task for host managed-node3 42613 1727204597.49257: done getting next task for host managed-node3 42613 1727204597.49260: ^ task is: TASK: Assert that the routing rule with table lookup 30400 matches the specified rule 42613 1727204597.49262: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204597.49267: getting variables 42613 1727204597.49269: in VariableManager get_vars() 42613 1727204597.49316: Calling all_inventory to load vars for managed-node3 42613 1727204597.49319: Calling groups_inventory to load vars for managed-node3 42613 1727204597.49321: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204597.49336: Calling all_plugins_play to load vars for managed-node3 42613 1727204597.49339: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204597.49350: Calling groups_plugins_play to load vars for managed-node3 42613 1727204597.49986: done sending task result for task 127b8e07-fff9-2f91-05d8-000000000062 42613 1727204597.49991: WORKER PROCESS EXITING 42613 1727204597.53298: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204597.57994: done with get_vars() 42613 1727204597.58039: done getting variables 42613 1727204597.58222: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert that the routing rule with table lookup 30400 matches the specified rule] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:166 Tuesday 24 September 2024 15:03:17 -0400 (0:00:00.137) 0:00:26.190 ***** 42613 1727204597.58254: entering _queue_task() for managed-node3/assert 42613 1727204597.59026: worker is 1 (out of 1 available) 42613 1727204597.59157: exiting _queue_task() for managed-node3/assert 42613 1727204597.59173: done queuing things up, now waiting for results queue to drain 42613 1727204597.59175: waiting for pending results... 
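The assertions evaluated above use Jinja2's `search` test, which is a thin wrapper around Python's `re.search`, against the captured `ip rule` stdout. A minimal sketch of the same checks in plain Python — the sample `stdout` here is hypothetical, reconstructed from the rule patterns the log shows being evaluated, not captured from a real host:

```python
import re

# Hypothetical `ip rule` output for table 30200, shaped like the
# rules the playbook asserts against (values taken from the log).
stdout = """\
30200:  from 198.51.100.58/26 lookup 30200
30201:  from all fwmark 0x1/0x1 lookup 30200
30202:  from all ipproto tcp lookup 30200
30203:  from all sport 128-256 lookup 30200
30204:  from all tos 0x08 lookup 30200"""

# The same regex patterns the log shows Ansible evaluating; Jinja2's
# `search` test delegates to re.search, so semantics match.
patterns = [
    r"30200:(\s+)from 198.51.100.58/26 lookup 30200",
    r"30201:(\s+)from all fwmark 0x1/0x1 lookup 30200",
    r"30202:(\s+)from all ipproto tcp lookup 30200",
    r"30203:(\s+)from all sport 128-256 lookup 30200",
    r"30204:(\s+)from all tos (0x08|throughput) lookup 30200",
]

# Each conditional is True when its pattern matches the output,
# mirroring the "Evaluated conditional (...): True" entries above.
results = [bool(re.search(p, stdout)) for p in patterns]
```

Note the last pattern accepts either `0x08` or its symbolic name `throughput`, since `ip rule` may print the TOS value either way depending on version.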
42613 1727204597.59566: running TaskExecutor() for managed-node3/TASK: Assert that the routing rule with table lookup 30400 matches the specified rule 42613 1727204597.59880: in run() - task 127b8e07-fff9-2f91-05d8-000000000063 42613 1727204597.59904: variable 'ansible_search_path' from source: unknown 42613 1727204597.59972: calling self._execute() 42613 1727204597.60553: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204597.60557: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204597.60560: variable 'omit' from source: magic vars 42613 1727204597.61974: variable 'ansible_distribution_major_version' from source: facts 42613 1727204597.61979: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204597.61982: variable 'ansible_distribution_major_version' from source: facts 42613 1727204597.62279: Evaluated conditional (ansible_distribution_major_version != "7"): True 42613 1727204597.62296: variable 'omit' from source: magic vars 42613 1727204597.62329: variable 'omit' from source: magic vars 42613 1727204597.62770: variable 'omit' from source: magic vars 42613 1727204597.62775: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 42613 1727204597.62778: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 42613 1727204597.62780: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 42613 1727204597.62782: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204597.62785: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204597.62820: variable 'inventory_hostname' from source: host vars for 'managed-node3' 42613 1727204597.63271: variable 'ansible_host' 
from source: host vars for 'managed-node3' 42613 1727204597.63275: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204597.63277: Set connection var ansible_shell_executable to /bin/sh 42613 1727204597.63280: Set connection var ansible_pipelining to False 42613 1727204597.63282: Set connection var ansible_module_compression to ZIP_DEFLATED 42613 1727204597.63284: Set connection var ansible_connection to ssh 42613 1727204597.63286: Set connection var ansible_timeout to 10 42613 1727204597.63288: Set connection var ansible_shell_type to sh 42613 1727204597.63290: variable 'ansible_shell_executable' from source: unknown 42613 1727204597.63293: variable 'ansible_connection' from source: unknown 42613 1727204597.63295: variable 'ansible_module_compression' from source: unknown 42613 1727204597.63297: variable 'ansible_shell_type' from source: unknown 42613 1727204597.63301: variable 'ansible_shell_executable' from source: unknown 42613 1727204597.63304: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204597.63310: variable 'ansible_pipelining' from source: unknown 42613 1727204597.63319: variable 'ansible_timeout' from source: unknown 42613 1727204597.63328: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204597.63711: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 42613 1727204597.63731: variable 'omit' from source: magic vars 42613 1727204597.63747: starting attempt loop 42613 1727204597.63756: running the handler 42613 1727204597.64168: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 42613 1727204597.64653: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 42613 1727204597.64927: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 42613 1727204597.65031: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 42613 1727204597.65214: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 42613 1727204597.65323: variable 'route_rule_table_30400' from source: set_fact 42613 1727204597.65412: Evaluated conditional (route_rule_table_30400.stdout is search("30400:(\s+)from all to 198.51.100.128/26 lookup 30400")): True 42613 1727204597.65745: variable 'route_rule_table_30400' from source: set_fact 42613 1727204597.66171: Evaluated conditional (route_rule_table_30400.stdout is search("30401:(\s+)from all iif iiftest \[detached\] lookup 30400")): True 42613 1727204597.66571: variable 'route_rule_table_30400' from source: set_fact 42613 1727204597.66574: Evaluated conditional (route_rule_table_30400.stdout is search("30402:(\s+)from all oif oiftest \[detached\] lookup 30400")): True 42613 1727204597.66577: handler run complete 42613 1727204597.66580: attempt loop complete, returning result 42613 1727204597.66583: _execute() done 42613 1727204597.66586: dumping result to json 42613 1727204597.66589: done dumping result, returning 42613 1727204597.66596: done running TaskExecutor() for managed-node3/TASK: Assert that the routing rule with table lookup 30400 matches the specified rule [127b8e07-fff9-2f91-05d8-000000000063] 42613 1727204597.66599: sending task result for task 127b8e07-fff9-2f91-05d8-000000000063 42613 1727204597.66684: done sending task result for task 127b8e07-fff9-2f91-05d8-000000000063 42613 1727204597.66687: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false } MSG: All assertions passed 42613 1727204597.66753: no more pending results, returning what we have 42613 1727204597.66757: 
results queue empty 42613 1727204597.66758: checking for any_errors_fatal 42613 1727204597.66770: done checking for any_errors_fatal 42613 1727204597.66771: checking for max_fail_percentage 42613 1727204597.66774: done checking for max_fail_percentage 42613 1727204597.66775: checking to see if all hosts have failed and the running result is not ok 42613 1727204597.66776: done checking to see if all hosts have failed 42613 1727204597.66776: getting the remaining hosts for this loop 42613 1727204597.66778: done getting the remaining hosts for this loop 42613 1727204597.66783: getting the next task for host managed-node3 42613 1727204597.66792: done getting next task for host managed-node3 42613 1727204597.66796: ^ task is: TASK: Assert that the routing rule with table lookup 30600 matches the specified rule 42613 1727204597.66798: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204597.66802: getting variables 42613 1727204597.66804: in VariableManager get_vars() 42613 1727204597.66844: Calling all_inventory to load vars for managed-node3 42613 1727204597.66847: Calling groups_inventory to load vars for managed-node3 42613 1727204597.66850: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204597.66864: Calling all_plugins_play to load vars for managed-node3 42613 1727204597.67085: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204597.67091: Calling groups_plugins_play to load vars for managed-node3 42613 1727204597.70764: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204597.76432: done with get_vars() 42613 1727204597.76589: done getting variables 42613 1727204597.76657: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert that the routing rule with table lookup 30600 matches the specified rule] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:175 Tuesday 24 September 2024 15:03:17 -0400 (0:00:00.185) 0:00:26.376 ***** 42613 1727204597.76862: entering _queue_task() for managed-node3/assert 42613 1727204597.77612: worker is 1 (out of 1 available) 42613 1727204597.77631: exiting _queue_task() for managed-node3/assert 42613 1727204597.77645: done queuing things up, now waiting for results queue to drain 42613 1727204597.77647: waiting for pending results... 
42613 1727204597.78259: running TaskExecutor() for managed-node3/TASK: Assert that the routing rule with table lookup 30600 matches the specified rule 42613 1727204597.78772: in run() - task 127b8e07-fff9-2f91-05d8-000000000064 42613 1727204597.78777: variable 'ansible_search_path' from source: unknown 42613 1727204597.78780: calling self._execute() 42613 1727204597.78783: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204597.78786: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204597.78983: variable 'omit' from source: magic vars 42613 1727204597.79831: variable 'ansible_distribution_major_version' from source: facts 42613 1727204597.79857: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204597.80371: variable 'ansible_distribution_major_version' from source: facts 42613 1727204597.80375: Evaluated conditional (ansible_distribution_major_version != "7"): True 42613 1727204597.80378: variable 'omit' from source: magic vars 42613 1727204597.80381: variable 'omit' from source: magic vars 42613 1727204597.80384: variable 'omit' from source: magic vars 42613 1727204597.80386: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 42613 1727204597.80393: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 42613 1727204597.80597: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 42613 1727204597.80623: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204597.80646: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204597.80688: variable 'inventory_hostname' from source: host vars for 'managed-node3' 42613 1727204597.80697: variable 'ansible_host' 
from source: host vars for 'managed-node3' 42613 1727204597.80705: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204597.80831: Set connection var ansible_shell_executable to /bin/sh 42613 1727204597.81081: Set connection var ansible_pipelining to False 42613 1727204597.81096: Set connection var ansible_module_compression to ZIP_DEFLATED 42613 1727204597.81104: Set connection var ansible_connection to ssh 42613 1727204597.81115: Set connection var ansible_timeout to 10 42613 1727204597.81122: Set connection var ansible_shell_type to sh 42613 1727204597.81158: variable 'ansible_shell_executable' from source: unknown 42613 1727204597.81170: variable 'ansible_connection' from source: unknown 42613 1727204597.81178: variable 'ansible_module_compression' from source: unknown 42613 1727204597.81186: variable 'ansible_shell_type' from source: unknown 42613 1727204597.81193: variable 'ansible_shell_executable' from source: unknown 42613 1727204597.81200: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204597.81208: variable 'ansible_pipelining' from source: unknown 42613 1727204597.81215: variable 'ansible_timeout' from source: unknown 42613 1727204597.81224: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204597.81396: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 42613 1727204597.81587: variable 'omit' from source: magic vars 42613 1727204597.81599: starting attempt loop 42613 1727204597.81607: running the handler 42613 1727204597.82025: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 42613 1727204597.82512: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 42613 1727204597.82830: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 42613 1727204597.83873: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 42613 1727204597.83877: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 42613 1727204597.83885: variable 'route_rule_table_30600' from source: set_fact 42613 1727204597.83887: Evaluated conditional (route_rule_table_30600.stdout is search("30600:(\s+)from all to 2001:db8::4/32 lookup 30600")): True 42613 1727204597.84245: variable 'route_rule_table_30600' from source: set_fact 42613 1727204597.84507: Evaluated conditional (route_rule_table_30600.stdout is search("30601:(\s+)not from all dport 128-256 lookup 30600")): True 42613 1727204597.84523: handler run complete 42613 1727204597.84549: attempt loop complete, returning result 42613 1727204597.84557: _execute() done 42613 1727204597.84564: dumping result to json 42613 1727204597.84575: done dumping result, returning 42613 1727204597.84590: done running TaskExecutor() for managed-node3/TASK: Assert that the routing rule with table lookup 30600 matches the specified rule [127b8e07-fff9-2f91-05d8-000000000064] 42613 1727204597.84601: sending task result for task 127b8e07-fff9-2f91-05d8-000000000064 ok: [managed-node3] => { "changed": false } MSG: All assertions passed 42613 1727204597.84784: no more pending results, returning what we have 42613 1727204597.84788: results queue empty 42613 1727204597.84789: checking for any_errors_fatal 42613 1727204597.84798: done checking for any_errors_fatal 42613 1727204597.84799: checking for max_fail_percentage 42613 1727204597.84802: done checking for max_fail_percentage 42613 1727204597.84803: checking to see if all hosts have failed and the running result is not ok 42613 1727204597.84804: done checking 
to see if all hosts have failed 42613 1727204597.84805: getting the remaining hosts for this loop 42613 1727204597.84806: done getting the remaining hosts for this loop 42613 1727204597.84811: getting the next task for host managed-node3 42613 1727204597.84819: done getting next task for host managed-node3 42613 1727204597.84823: ^ task is: TASK: Assert that the routing rule with 'custom' table lookup matches the specified rule 42613 1727204597.84826: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 42613 1727204597.84830: getting variables 42613 1727204597.84832: in VariableManager get_vars() 42613 1727204597.84881: Calling all_inventory to load vars for managed-node3 42613 1727204597.84884: Calling groups_inventory to load vars for managed-node3 42613 1727204597.84887: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204597.84896: done sending task result for task 127b8e07-fff9-2f91-05d8-000000000064 42613 1727204597.84899: WORKER PROCESS EXITING 42613 1727204597.84980: Calling all_plugins_play to load vars for managed-node3 42613 1727204597.84984: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204597.84987: Calling groups_plugins_play to load vars for managed-node3 42613 1727204597.89175: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204597.94456: done with get_vars() 42613 1727204597.94723: done getting variables 42613 1727204597.94956: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=True) TASK [Assert that the routing rule with 'custom' table lookup matches the specified rule] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:183 Tuesday 24 September 2024 15:03:17 -0400 (0:00:00.182) 0:00:26.558 ***** 42613 1727204597.95060: entering _queue_task() for managed-node3/assert 42613 1727204597.95968: worker is 1 (out of 1 available) 42613 1727204597.95987: exiting _queue_task() for managed-node3/assert 42613 1727204597.96116: done queuing things up, now waiting for results queue to drain 42613 1727204597.96118: waiting for pending results... 42613 1727204597.96487: running TaskExecutor() for managed-node3/TASK: Assert that the routing rule with 'custom' table lookup matches the specified rule 42613 1727204597.96783: in run() - task 127b8e07-fff9-2f91-05d8-000000000065 42613 1727204597.96806: variable 'ansible_search_path' from source: unknown 42613 1727204597.96855: calling self._execute() 42613 1727204597.97372: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204597.97377: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204597.97380: variable 'omit' from source: magic vars 42613 1727204597.98023: variable 'ansible_distribution_major_version' from source: facts 42613 1727204597.98046: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204597.98409: variable 'ansible_distribution_major_version' from source: facts 42613 1727204597.98422: Evaluated conditional (ansible_distribution_major_version != "7"): True 42613 1727204597.98433: variable 'omit' from source: magic vars 42613 1727204597.98464: variable 'omit' from source: magic vars 42613 1727204597.98514: variable 'omit' from source: magic vars 42613 1727204597.99071: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 42613 1727204597.99076: Loading Connection 'ssh' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 42613 1727204597.99078: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 42613 1727204597.99080: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204597.99083: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204597.99085: variable 'inventory_hostname' from source: host vars for 'managed-node3' 42613 1727204597.99087: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204597.99088: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204597.99279: Set connection var ansible_shell_executable to /bin/sh 42613 1727204597.99291: Set connection var ansible_pipelining to False 42613 1727204597.99304: Set connection var ansible_module_compression to ZIP_DEFLATED 42613 1727204597.99310: Set connection var ansible_connection to ssh 42613 1727204597.99320: Set connection var ansible_timeout to 10 42613 1727204597.99326: Set connection var ansible_shell_type to sh 42613 1727204597.99359: variable 'ansible_shell_executable' from source: unknown 42613 1727204597.99370: variable 'ansible_connection' from source: unknown 42613 1727204597.99377: variable 'ansible_module_compression' from source: unknown 42613 1727204597.99385: variable 'ansible_shell_type' from source: unknown 42613 1727204597.99392: variable 'ansible_shell_executable' from source: unknown 42613 1727204597.99398: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204597.99406: variable 'ansible_pipelining' from source: unknown 42613 1727204597.99412: variable 'ansible_timeout' from source: unknown 42613 1727204597.99420: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 
42613 1727204597.99797: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 42613 1727204597.99819: variable 'omit' from source: magic vars 42613 1727204597.99831: starting attempt loop 42613 1727204597.99841: running the handler 42613 1727204598.00268: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 42613 1727204598.00753: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 42613 1727204598.01172: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 42613 1727204598.01176: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 42613 1727204598.01178: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 42613 1727204598.01450: variable 'route_rule_table_custom' from source: set_fact 42613 1727204598.01495: Evaluated conditional (route_rule_table_custom.stdout is search("200:(\s+)from 198.51.100.56/26 lookup custom")): True 42613 1727204598.01587: handler run complete 42613 1727204598.01610: attempt loop complete, returning result 42613 1727204598.01618: _execute() done 42613 1727204598.01626: dumping result to json 42613 1727204598.01633: done dumping result, returning 42613 1727204598.01673: done running TaskExecutor() for managed-node3/TASK: Assert that the routing rule with 'custom' table lookup matches the specified rule [127b8e07-fff9-2f91-05d8-000000000065] 42613 1727204598.01717: sending task result for task 127b8e07-fff9-2f91-05d8-000000000065 ok: [managed-node3] => { "changed": false } MSG: All assertions passed 42613 1727204598.01940: no more pending results, 
returning what we have 42613 1727204598.01944: results queue empty 42613 1727204598.01946: checking for any_errors_fatal 42613 1727204598.01957: done checking for any_errors_fatal 42613 1727204598.01959: checking for max_fail_percentage 42613 1727204598.01963: done checking for max_fail_percentage 42613 1727204598.01964: checking to see if all hosts have failed and the running result is not ok 42613 1727204598.01967: done checking to see if all hosts have failed 42613 1727204598.01967: getting the remaining hosts for this loop 42613 1727204598.01969: done getting the remaining hosts for this loop 42613 1727204598.01974: getting the next task for host managed-node3 42613 1727204598.01983: done getting next task for host managed-node3 42613 1727204598.01986: ^ task is: TASK: Assert that the specified IPv4 routing rule was configured in the connection "{{ interface }}" 42613 1727204598.01989: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204598.01992: getting variables 42613 1727204598.01994: in VariableManager get_vars() 42613 1727204598.02037: Calling all_inventory to load vars for managed-node3 42613 1727204598.02040: Calling groups_inventory to load vars for managed-node3 42613 1727204598.02042: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204598.02055: Calling all_plugins_play to load vars for managed-node3 42613 1727204598.02058: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204598.02060: Calling groups_plugins_play to load vars for managed-node3 42613 1727204598.02963: done sending task result for task 127b8e07-fff9-2f91-05d8-000000000065 42613 1727204598.03576: WORKER PROCESS EXITING 42613 1727204598.05332: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204598.09072: done with get_vars() 42613 1727204598.09112: done getting variables 42613 1727204598.09209: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 42613 1727204598.09637: variable 'interface' from source: set_fact TASK [Assert that the specified IPv4 routing rule was configured in the connection "ethtest0"] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:190 Tuesday 24 September 2024 15:03:18 -0400 (0:00:00.146) 0:00:26.704 ***** 42613 1727204598.09702: entering _queue_task() for managed-node3/assert 42613 1727204598.10099: worker is 1 (out of 1 available) 42613 1727204598.10228: exiting _queue_task() for managed-node3/assert 42613 1727204598.10239: done queuing things up, now waiting for results queue to drain 42613 1727204598.10241: waiting for pending results... 
42613 1727204598.10464: running TaskExecutor() for managed-node3/TASK: Assert that the specified IPv4 routing rule was configured in the connection "ethtest0" 42613 1727204598.10570: in run() - task 127b8e07-fff9-2f91-05d8-000000000066 42613 1727204598.10775: variable 'ansible_search_path' from source: unknown 42613 1727204598.10780: calling self._execute() 42613 1727204598.10783: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204598.10786: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204598.10789: variable 'omit' from source: magic vars 42613 1727204598.11234: variable 'ansible_distribution_major_version' from source: facts 42613 1727204598.11248: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204598.11256: variable 'omit' from source: magic vars 42613 1727204598.11285: variable 'omit' from source: magic vars 42613 1727204598.11429: variable 'interface' from source: set_fact 42613 1727204598.11457: variable 'omit' from source: magic vars 42613 1727204598.11507: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 42613 1727204598.11617: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 42613 1727204598.11879: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 42613 1727204598.11883: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204598.11885: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204598.11888: variable 'inventory_hostname' from source: host vars for 'managed-node3' 42613 1727204598.11890: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204598.11893: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node3' 42613 1727204598.11895: Set connection var ansible_shell_executable to /bin/sh 42613 1727204598.11897: Set connection var ansible_pipelining to False 42613 1727204598.11899: Set connection var ansible_module_compression to ZIP_DEFLATED 42613 1727204598.11902: Set connection var ansible_connection to ssh 42613 1727204598.11904: Set connection var ansible_timeout to 10 42613 1727204598.11906: Set connection var ansible_shell_type to sh 42613 1727204598.11908: variable 'ansible_shell_executable' from source: unknown 42613 1727204598.11910: variable 'ansible_connection' from source: unknown 42613 1727204598.11913: variable 'ansible_module_compression' from source: unknown 42613 1727204598.11915: variable 'ansible_shell_type' from source: unknown 42613 1727204598.11917: variable 'ansible_shell_executable' from source: unknown 42613 1727204598.11919: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204598.11921: variable 'ansible_pipelining' from source: unknown 42613 1727204598.11923: variable 'ansible_timeout' from source: unknown 42613 1727204598.11926: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204598.12140: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 42613 1727204598.12149: variable 'omit' from source: magic vars 42613 1727204598.12156: starting attempt loop 42613 1727204598.12159: running the handler 42613 1727204598.12383: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 42613 1727204598.12673: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 42613 1727204598.12719: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 42613 1727204598.12811: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 42613 1727204598.12854: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 42613 1727204598.12956: variable 'connection_route_rule' from source: set_fact 42613 1727204598.12992: Evaluated conditional (connection_route_rule.stdout is search("priority 30200 from 198.51.100.58/26 table 30200")): True 42613 1727204598.13156: variable 'connection_route_rule' from source: set_fact 42613 1727204598.13191: Evaluated conditional (connection_route_rule.stdout is search("priority 30201 from 0.0.0.0/0 fwmark 0x1/0x1 table 30200")): True 42613 1727204598.13343: variable 'connection_route_rule' from source: set_fact 42613 1727204598.13369: Evaluated conditional (connection_route_rule.stdout is search("priority 30202 from 0.0.0.0/0 ipproto 6 table 30200")): True 42613 1727204598.13525: variable 'connection_route_rule' from source: set_fact 42613 1727204598.13549: Evaluated conditional (connection_route_rule.stdout is search("priority 30203 from 0.0.0.0/0 sport 128-256 table 30200")): True 42613 1727204598.13697: variable 'connection_route_rule' from source: set_fact 42613 1727204598.13729: Evaluated conditional (connection_route_rule.stdout is search("priority 30204 from 0.0.0.0/0 tos 0x08 table 30200")): True 42613 1727204598.13872: variable 'connection_route_rule' from source: set_fact 42613 1727204598.13896: Evaluated conditional (connection_route_rule.stdout is search("priority 30400 to 198.51.100.128/26 table 30400")): True 42613 1727204598.14197: variable 'connection_route_rule' from source: set_fact 42613 1727204598.14222: Evaluated conditional (connection_route_rule.stdout is search("priority 30401 from 0.0.0.0/0 iif iiftest table 30400")): True 42613 1727204598.14573: variable 'connection_route_rule' from source: set_fact 
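The conditionals being evaluated above use Jinja2's `search` test against the captured `connection_route_rule.stdout`; that test is, in essence, a thin wrapper around Python's `re.search`. The sketch below reproduces the matching logic in plain Python. The sample stdout is a hypothetical reconstruction containing the rule lines the task asserts, not the actual captured output.

```python
import re

# Hypothetical stdout containing the routing-rule lines asserted above.
stdout = "\n".join([
    "priority 30200 from 198.51.100.58/26 table 30200",
    "priority 30201 from 0.0.0.0/0 fwmark 0x1/0x1 table 30200",
    "priority 30202 from 0.0.0.0/0 ipproto 6 table 30200",
])

# The same literal strings the task passes to `is search(...)`. Note the
# dots are regex metacharacters, but "any character" still matches the
# literal dot, so the patterns behave as intended here.
expected = [
    "priority 30200 from 198.51.100.58/26 table 30200",
    "priority 30201 from 0.0.0.0/0 fwmark 0x1/0x1 table 30200",
    "priority 30202 from 0.0.0.0/0 ipproto 6 table 30200",
]

# Each conditional evaluates True when its pattern occurs in the output,
# mirroring the "Evaluated conditional (...): True" entries in the log.
results = [bool(re.search(pat, stdout)) for pat in expected]
print(results)  # [True, True, True]
```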
42613 1727204598.14577: Evaluated conditional (connection_route_rule.stdout is search("priority 30402 from 0.0.0.0/0 oif oiftest table 30400")): True 42613 1727204598.14771: variable 'connection_route_rule' from source: set_fact 42613 1727204598.14775: Evaluated conditional (connection_route_rule.stdout is search("priority 30403 from 0.0.0.0/0 table 30400")): True 42613 1727204598.14777: variable 'connection_route_rule' from source: set_fact 42613 1727204598.14800: Evaluated conditional (connection_route_rule.stdout is search("priority 200 from 198.51.100.56/26 table 200")): True 42613 1727204598.14804: handler run complete 42613 1727204598.14821: attempt loop complete, returning result 42613 1727204598.14824: _execute() done 42613 1727204598.14827: dumping result to json 42613 1727204598.14829: done dumping result, returning 42613 1727204598.14842: done running TaskExecutor() for managed-node3/TASK: Assert that the specified IPv4 routing rule was configured in the connection "ethtest0" [127b8e07-fff9-2f91-05d8-000000000066] 42613 1727204598.14851: sending task result for task 127b8e07-fff9-2f91-05d8-000000000066 ok: [managed-node3] => { "changed": false } MSG: All assertions passed 42613 1727204598.15019: no more pending results, returning what we have 42613 1727204598.15023: results queue empty 42613 1727204598.15024: checking for any_errors_fatal 42613 1727204598.15034: done checking for any_errors_fatal 42613 1727204598.15035: checking for max_fail_percentage 42613 1727204598.15038: done checking for max_fail_percentage 42613 1727204598.15038: checking to see if all hosts have failed and the running result is not ok 42613 1727204598.15039: done checking to see if all hosts have failed 42613 1727204598.15040: getting the remaining hosts for this loop 42613 1727204598.15042: done getting the remaining hosts for this loop 42613 1727204598.15055: getting the next task for host managed-node3 42613 1727204598.15064: done getting next task for host managed-node3 42613 
1727204598.15174: ^ task is: TASK: Assert that the specified IPv6 routing rule was configured in the connection "{{ interface }}" 42613 1727204598.15177: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 42613 1727204598.15181: getting variables 42613 1727204598.15183: in VariableManager get_vars() 42613 1727204598.15225: Calling all_inventory to load vars for managed-node3 42613 1727204598.15228: Calling groups_inventory to load vars for managed-node3 42613 1727204598.15231: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204598.15244: Calling all_plugins_play to load vars for managed-node3 42613 1727204598.15247: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204598.15250: Calling groups_plugins_play to load vars for managed-node3 42613 1727204598.15941: done sending task result for task 127b8e07-fff9-2f91-05d8-000000000066 42613 1727204598.15947: WORKER PROCESS EXITING 42613 1727204598.17289: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204598.19710: done with get_vars() 42613 1727204598.19767: done getting variables 42613 1727204598.19845: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 42613 1727204598.20013: variable 'interface' from source: set_fact TASK [Assert that the specified IPv6 routing rule was configured in the connection "ethtest0"] *** task path: 
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:205 Tuesday 24 September 2024 15:03:18 -0400 (0:00:00.103) 0:00:26.808 ***** 42613 1727204598.20050: entering _queue_task() for managed-node3/assert 42613 1727204598.20814: worker is 1 (out of 1 available) 42613 1727204598.20830: exiting _queue_task() for managed-node3/assert 42613 1727204598.20847: done queuing things up, now waiting for results queue to drain 42613 1727204598.20848: waiting for pending results... 42613 1727204598.21174: running TaskExecutor() for managed-node3/TASK: Assert that the specified IPv6 routing rule was configured in the connection "ethtest0" 42613 1727204598.21299: in run() - task 127b8e07-fff9-2f91-05d8-000000000067 42613 1727204598.21322: variable 'ansible_search_path' from source: unknown 42613 1727204598.21373: calling self._execute() 42613 1727204598.21493: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204598.21510: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204598.21526: variable 'omit' from source: magic vars 42613 1727204598.21985: variable 'ansible_distribution_major_version' from source: facts 42613 1727204598.22005: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204598.22017: variable 'omit' from source: magic vars 42613 1727204598.22056: variable 'omit' from source: magic vars 42613 1727204598.22273: variable 'interface' from source: set_fact 42613 1727204598.22276: variable 'omit' from source: magic vars 42613 1727204598.22280: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 42613 1727204598.22308: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 42613 1727204598.22336: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 42613 1727204598.22363: Loading ShellModule 
'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204598.22385: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204598.22424: variable 'inventory_hostname' from source: host vars for 'managed-node3' 42613 1727204598.22432: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204598.22443: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204598.22569: Set connection var ansible_shell_executable to /bin/sh 42613 1727204598.22583: Set connection var ansible_pipelining to False 42613 1727204598.22599: Set connection var ansible_module_compression to ZIP_DEFLATED 42613 1727204598.22606: Set connection var ansible_connection to ssh 42613 1727204598.22615: Set connection var ansible_timeout to 10 42613 1727204598.22621: Set connection var ansible_shell_type to sh 42613 1727204598.22652: variable 'ansible_shell_executable' from source: unknown 42613 1727204598.22661: variable 'ansible_connection' from source: unknown 42613 1727204598.22670: variable 'ansible_module_compression' from source: unknown 42613 1727204598.22707: variable 'ansible_shell_type' from source: unknown 42613 1727204598.22710: variable 'ansible_shell_executable' from source: unknown 42613 1727204598.22712: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204598.22714: variable 'ansible_pipelining' from source: unknown 42613 1727204598.22716: variable 'ansible_timeout' from source: unknown 42613 1727204598.22718: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204598.22872: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 42613 1727204598.22889: variable 'omit' from source: magic vars 42613 1727204598.22924: starting attempt loop 42613 1727204598.22927: running the handler 42613 1727204598.23108: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 42613 1727204598.23391: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 42613 1727204598.23446: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 42613 1727204598.23901: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 42613 1727204598.23923: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 42613 1727204598.24031: variable 'connection_route_rule6' from source: set_fact 42613 1727204598.24074: Evaluated conditional (connection_route_rule6.stdout is search("priority 30600 to 2001:db8::4/32 table 30600")): True 42613 1727204598.24334: variable 'connection_route_rule6' from source: set_fact 42613 1727204598.24349: Evaluated conditional (connection_route_rule6.stdout is search("priority 30601 not from ::/0 dport 128-256 table 30600") or connection_route_rule6.stdout is search("not priority 30601 from ::/0 dport 128-256 table 30600")): True 42613 1727204598.24509: variable 'connection_route_rule6' from source: set_fact 42613 1727204598.24542: Evaluated conditional (connection_route_rule6.stdout is search("priority 30602 from ::/0 table 30600")): True 42613 1727204598.24659: handler run complete 42613 1727204598.24663: attempt loop complete, returning result 42613 1727204598.24666: _execute() done 42613 1727204598.24669: dumping result to json 42613 1727204598.24672: done dumping result, returning 42613 
1727204598.24674: done running TaskExecutor() for managed-node3/TASK: Assert that the specified IPv6 routing rule was configured in the connection "ethtest0" [127b8e07-fff9-2f91-05d8-000000000067] 42613 1727204598.24676: sending task result for task 127b8e07-fff9-2f91-05d8-000000000067 42613 1727204598.24756: done sending task result for task 127b8e07-fff9-2f91-05d8-000000000067 42613 1727204598.24759: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false } MSG: All assertions passed 42613 1727204598.24815: no more pending results, returning what we have 42613 1727204598.24819: results queue empty 42613 1727204598.24821: checking for any_errors_fatal 42613 1727204598.24834: done checking for any_errors_fatal 42613 1727204598.24835: checking for max_fail_percentage 42613 1727204598.24840: done checking for max_fail_percentage 42613 1727204598.24840: checking to see if all hosts have failed and the running result is not ok 42613 1727204598.24841: done checking to see if all hosts have failed 42613 1727204598.24842: getting the remaining hosts for this loop 42613 1727204598.24844: done getting the remaining hosts for this loop 42613 1727204598.24848: getting the next task for host managed-node3 42613 1727204598.24856: done getting next task for host managed-node3 42613 1727204598.24859: ^ task is: TASK: Remove the dedicated test file in `/etc/iproute2/rt_tables.d/` 42613 1727204598.24862: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204598.24867: getting variables 42613 1727204598.24869: in VariableManager get_vars() 42613 1727204598.24912: Calling all_inventory to load vars for managed-node3 42613 1727204598.24915: Calling groups_inventory to load vars for managed-node3 42613 1727204598.24917: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204598.24930: Calling all_plugins_play to load vars for managed-node3 42613 1727204598.24933: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204598.24936: Calling groups_plugins_play to load vars for managed-node3 42613 1727204598.27226: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204598.29524: done with get_vars() 42613 1727204598.29569: done getting variables TASK [Remove the dedicated test file in `/etc/iproute2/rt_tables.d/`] ********** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:213 Tuesday 24 September 2024 15:03:18 -0400 (0:00:00.096) 0:00:26.904 ***** 42613 1727204598.29678: entering _queue_task() for managed-node3/file 42613 1727204598.30063: worker is 1 (out of 1 available) 42613 1727204598.30078: exiting _queue_task() for managed-node3/file 42613 1727204598.30091: done queuing things up, now waiting for results queue to drain 42613 1727204598.30092: waiting for pending results... 
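The IPv6 assertion for priority 30601 just completed is notable: it or-s two patterns, `"priority 30601 not from ..."` and `"not priority 30601 from ..."`, apparently to tolerate either placement of the invert (`not`) flag in the rendered rule. A minimal Python sketch of that any-of-several-renderings check, with hypothetical sample output:

```python
import re

def rule_present(stdout: str, *patterns: str) -> bool:
    """Mirror the or-ed Jinja condition: the rule counts as configured
    if any one of the accepted renderings is found in the output."""
    return any(re.search(p, stdout) for p in patterns)

# Hypothetical stdout where the invert flag is printed after the
# priority -- one of the two spellings the task accepts.
stdout6 = "priority 30601 not from ::/0 dport 128-256 table 30600"

ok = rule_present(
    stdout6,
    "priority 30601 not from ::/0 dport 128-256 table 30600",
    "not priority 30601 from ::/0 dport 128-256 table 30600",
)
print(ok)  # True
```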
42613 1727204598.30493: running TaskExecutor() for managed-node3/TASK: Remove the dedicated test file in `/etc/iproute2/rt_tables.d/` 42613 1727204598.30560: in run() - task 127b8e07-fff9-2f91-05d8-000000000068 42613 1727204598.30592: variable 'ansible_search_path' from source: unknown 42613 1727204598.30641: calling self._execute() 42613 1727204598.30817: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204598.30822: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204598.30825: variable 'omit' from source: magic vars 42613 1727204598.31321: variable 'ansible_distribution_major_version' from source: facts 42613 1727204598.31364: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204598.31457: variable 'omit' from source: magic vars 42613 1727204598.31462: variable 'omit' from source: magic vars 42613 1727204598.31481: variable 'omit' from source: magic vars 42613 1727204598.31552: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 42613 1727204598.31617: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 42613 1727204598.31658: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 42613 1727204598.31701: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204598.31718: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204598.31760: variable 'inventory_hostname' from source: host vars for 'managed-node3' 42613 1727204598.31771: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204598.31892: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204598.31930: Set connection var 
ansible_shell_executable to /bin/sh 42613 1727204598.31945: Set connection var ansible_pipelining to False 42613 1727204598.31959: Set connection var ansible_module_compression to ZIP_DEFLATED 42613 1727204598.31968: Set connection var ansible_connection to ssh 42613 1727204598.31980: Set connection var ansible_timeout to 10 42613 1727204598.31987: Set connection var ansible_shell_type to sh 42613 1727204598.32037: variable 'ansible_shell_executable' from source: unknown 42613 1727204598.32050: variable 'ansible_connection' from source: unknown 42613 1727204598.32058: variable 'ansible_module_compression' from source: unknown 42613 1727204598.32068: variable 'ansible_shell_type' from source: unknown 42613 1727204598.32077: variable 'ansible_shell_executable' from source: unknown 42613 1727204598.32084: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204598.32093: variable 'ansible_pipelining' from source: unknown 42613 1727204598.32112: variable 'ansible_timeout' from source: unknown 42613 1727204598.32122: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204598.32421: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 42613 1727204598.32449: variable 'omit' from source: magic vars 42613 1727204598.32461: starting attempt loop 42613 1727204598.32471: running the handler 42613 1727204598.32492: _low_level_execute_command(): starting 42613 1727204598.32545: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 42613 1727204598.33600: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204598.33694: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204598.33778: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204598.35642: stdout chunk (state=3): >>>/root <<< 42613 1727204598.35791: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204598.35854: stderr chunk (state=3): >>><<< 42613 1727204598.35874: stdout chunk (state=3): >>><<< 42613 1727204598.36019: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204598.36024: _low_level_execute_command(): starting 42613 1727204598.36027: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204598.3590932-44605-30914860579212 `" && echo ansible-tmp-1727204598.3590932-44605-30914860579212="` echo /root/.ansible/tmp/ansible-tmp-1727204598.3590932-44605-30914860579212 `" ) && sleep 0' 42613 1727204598.36661: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204598.36680: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204598.36699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204598.36720: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 42613 1727204598.36734: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 42613 1727204598.36815: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204598.36855: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204598.36872: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204598.36926: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204598.37039: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204598.39203: stdout chunk (state=3): >>>ansible-tmp-1727204598.3590932-44605-30914860579212=/root/.ansible/tmp/ansible-tmp-1727204598.3590932-44605-30914860579212 <<< 42613 1727204598.39397: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204598.39441: stderr chunk (state=3): >>><<< 42613 1727204598.39456: stdout chunk (state=3): >>><<< 42613 1727204598.39499: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204598.3590932-44605-30914860579212=/root/.ansible/tmp/ansible-tmp-1727204598.3590932-44605-30914860579212 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204598.39611: variable 'ansible_module_compression' from source: unknown 42613 1727204598.39644: ANSIBALLZ: Using lock for file 42613 1727204598.39652: ANSIBALLZ: Acquiring lock 42613 1727204598.39659: ANSIBALLZ: Lock acquired: 139982757273312 42613 1727204598.39669: ANSIBALLZ: Creating module 42613 1727204598.54873: ANSIBALLZ: Writing module into payload 42613 1727204598.55001: ANSIBALLZ: Writing module 42613 1727204598.55059: ANSIBALLZ: Renaming module 42613 1727204598.55062: ANSIBALLZ: Done creating module 42613 1727204598.55064: variable 'ansible_facts' from source: unknown 42613 1727204598.55395: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204598.3590932-44605-30914860579212/AnsiballZ_file.py 42613 1727204598.55399: Sending initial data 42613 1727204598.55401: Sent initial data (152 bytes) 42613 1727204598.56146: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204598.56163: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204598.56191: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204598.56301: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204598.58100: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 42613 1727204598.58119: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 42613 1727204598.58195: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 42613 1727204598.58263: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-42613igbktgy6/tmp84ni7fvt /root/.ansible/tmp/ansible-tmp-1727204598.3590932-44605-30914860579212/AnsiballZ_file.py <<< 42613 1727204598.58269: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204598.3590932-44605-30914860579212/AnsiballZ_file.py" <<< 42613 1727204598.58345: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-42613igbktgy6/tmp84ni7fvt" to remote "/root/.ansible/tmp/ansible-tmp-1727204598.3590932-44605-30914860579212/AnsiballZ_file.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204598.3590932-44605-30914860579212/AnsiballZ_file.py" <<< 42613 1727204598.59293: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204598.59464: stderr chunk (state=3): >>><<< 42613 1727204598.59469: stdout chunk (state=3): >>><<< 42613 1727204598.59471: done transferring module to remote 42613 1727204598.59473: _low_level_execute_command(): starting 42613 1727204598.59475: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204598.3590932-44605-30914860579212/ /root/.ansible/tmp/ansible-tmp-1727204598.3590932-44605-30914860579212/AnsiballZ_file.py && sleep 0' 42613 1727204598.60095: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204598.60109: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204598.60133: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204598.60234: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204598.62252: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204598.62315: stderr chunk (state=3): >>><<< 42613 1727204598.62319: stdout chunk (state=3): >>><<< 42613 1727204598.62335: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204598.62341: _low_level_execute_command(): starting 42613 1727204598.62347: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204598.3590932-44605-30914860579212/AnsiballZ_file.py && sleep 0' 42613 1727204598.62919: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 42613 1727204598.62924: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204598.62927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204598.63034: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204598.63038: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204598.63140: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204598.81652: stdout chunk (state=3): >>> <<< 
42613 1727204598.81658: stdout chunk (state=3): >>>{"path": "/etc/iproute2/rt_tables.d/table.conf", "changed": true, "diff": {"before": {"path": "/etc/iproute2/rt_tables.d/table.conf", "state": "file"}, "after": {"path": "/etc/iproute2/rt_tables.d/table.conf", "state": "absent"}}, "state": "absent", "invocation": {"module_args": {"state": "absent", "path": "/etc/iproute2/rt_tables.d/table.conf", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "mode": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 42613 1727204598.83233: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. <<< 42613 1727204598.83322: stderr chunk (state=3): >>><<< 42613 1727204598.83326: stdout chunk (state=3): >>><<< 42613 1727204598.83348: _low_level_execute_command() done: rc=0, stdout= {"path": "/etc/iproute2/rt_tables.d/table.conf", "changed": true, "diff": {"before": {"path": "/etc/iproute2/rt_tables.d/table.conf", "state": "file"}, "after": {"path": "/etc/iproute2/rt_tables.d/table.conf", "state": "absent"}}, "state": "absent", "invocation": {"module_args": {"state": "absent", "path": "/etc/iproute2/rt_tables.d/table.conf", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "mode": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
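The JSON result above was produced by the `file` module with `state: absent`. The playbook source itself is not part of this log, but the logged `module_args` and the task name recorded further down ("Remove the dedicated test file in `/etc/iproute2/rt_tables.d/`") imply a task roughly like the following reconstruction (a sketch from the logged arguments, not the verbatim playbook):

```yaml
# Reconstructed from the module_args in the log above -- illustrative only.
- name: Remove the dedicated test file in `/etc/iproute2/rt_tables.d/`
  ansible.builtin.file:
    path: /etc/iproute2/rt_tables.d/table.conf
    state: absent
```

With `state: absent`, the module reports `changed: true` and a before/after diff (`state: file` → `state: absent`) when the file existed, exactly as the result JSON shows.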
42613 1727204598.83446: done with _execute_module (file, {'state': 'absent', 'path': '/etc/iproute2/rt_tables.d/table.conf', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204598.3590932-44605-30914860579212/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 42613 1727204598.83450: _low_level_execute_command(): starting 42613 1727204598.83459: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204598.3590932-44605-30914860579212/ > /dev/null 2>&1 && sleep 0' 42613 1727204598.84179: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204598.84214: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204598.84241: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204598.84263: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 42613 1727204598.84292: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 42613 1727204598.84341: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 42613 1727204598.84349: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found <<< 42613 1727204598.84368: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204598.84448: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204598.84450: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204598.84452: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204598.84515: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204598.86774: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204598.86779: stdout chunk (state=3): >>><<< 42613 1727204598.86781: stderr chunk (state=3): >>><<< 42613 1727204598.86784: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204598.86786: handler run complete 42613 1727204598.86788: attempt loop complete, returning result 42613 1727204598.86789: _execute() done 42613 1727204598.86791: dumping result to json 42613 1727204598.86793: done dumping result, returning 42613 1727204598.86794: done running TaskExecutor() for managed-node3/TASK: Remove the dedicated test file in `/etc/iproute2/rt_tables.d/` [127b8e07-fff9-2f91-05d8-000000000068] 42613 1727204598.86796: sending task result for task 127b8e07-fff9-2f91-05d8-000000000068 42613 1727204598.86877: done sending task result for task 127b8e07-fff9-2f91-05d8-000000000068 42613 1727204598.86880: WORKER PROCESS EXITING changed: [managed-node3] => { "changed": true, "path": "/etc/iproute2/rt_tables.d/table.conf", "state": "absent" } 42613 1727204598.87030: no more pending results, returning what we have 42613 1727204598.87033: results queue empty 42613 1727204598.87035: checking for any_errors_fatal 42613 1727204598.87041: done checking for any_errors_fatal 42613 1727204598.87042: checking for max_fail_percentage 42613 1727204598.87044: done checking for max_fail_percentage 42613 1727204598.87045: checking to see if all hosts have failed and the running result is not ok 42613 1727204598.87046: done checking to see if all hosts have failed 42613 1727204598.87047: getting the remaining hosts for this loop 42613 1727204598.87048: done getting the remaining hosts for this loop 42613 1727204598.87053: getting the next task for host managed-node3 42613 1727204598.87060: done getting next task for host managed-node3 42613 1727204598.87063: ^ task is: TASK: meta (flush_handlers) 42613 1727204598.87068: ^ state is: HOST STATE: block=3, 
task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 42613 1727204598.87073: getting variables 42613 1727204598.87074: in VariableManager get_vars() 42613 1727204598.87115: Calling all_inventory to load vars for managed-node3 42613 1727204598.87117: Calling groups_inventory to load vars for managed-node3 42613 1727204598.87120: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204598.87134: Calling all_plugins_play to load vars for managed-node3 42613 1727204598.87138: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204598.87142: Calling groups_plugins_play to load vars for managed-node3 42613 1727204598.89353: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204598.91749: done with get_vars() 42613 1727204598.91789: done getting variables 42613 1727204598.91881: in VariableManager get_vars() 42613 1727204598.91896: Calling all_inventory to load vars for managed-node3 42613 1727204598.91899: Calling groups_inventory to load vars for managed-node3 42613 1727204598.91901: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204598.91906: Calling all_plugins_play to load vars for managed-node3 42613 1727204598.91909: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204598.91912: Calling groups_plugins_play to load vars for managed-node3 42613 1727204598.93581: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204598.95947: done with get_vars() 42613 1727204598.96000: done queuing things up, now waiting for results queue to drain 42613 1727204598.96003: results queue empty 42613 1727204598.96004: checking for any_errors_fatal 42613 
1727204598.96008: done checking for any_errors_fatal 42613 1727204598.96008: checking for max_fail_percentage 42613 1727204598.96009: done checking for max_fail_percentage 42613 1727204598.96010: checking to see if all hosts have failed and the running result is not ok 42613 1727204598.96011: done checking to see if all hosts have failed 42613 1727204598.96012: getting the remaining hosts for this loop 42613 1727204598.96012: done getting the remaining hosts for this loop 42613 1727204598.96015: getting the next task for host managed-node3 42613 1727204598.96021: done getting next task for host managed-node3 42613 1727204598.96022: ^ task is: TASK: meta (flush_handlers) 42613 1727204598.96024: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 42613 1727204598.96027: getting variables 42613 1727204598.96028: in VariableManager get_vars() 42613 1727204598.96042: Calling all_inventory to load vars for managed-node3 42613 1727204598.96044: Calling groups_inventory to load vars for managed-node3 42613 1727204598.96046: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204598.96055: Calling all_plugins_play to load vars for managed-node3 42613 1727204598.96057: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204598.96059: Calling groups_plugins_play to load vars for managed-node3 42613 1727204598.97920: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204599.00941: done with get_vars() 42613 1727204599.01058: done getting variables 42613 1727204599.01125: in VariableManager get_vars() 42613 1727204599.01141: Calling all_inventory to load vars for managed-node3 42613 1727204599.01143: Calling groups_inventory to load 
vars for managed-node3 42613 1727204599.01146: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204599.01266: Calling all_plugins_play to load vars for managed-node3 42613 1727204599.01282: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204599.01287: Calling groups_plugins_play to load vars for managed-node3 42613 1727204599.11028: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204599.14417: done with get_vars() 42613 1727204599.14461: done queuing things up, now waiting for results queue to drain 42613 1727204599.14463: results queue empty 42613 1727204599.14464: checking for any_errors_fatal 42613 1727204599.14468: done checking for any_errors_fatal 42613 1727204599.14469: checking for max_fail_percentage 42613 1727204599.14470: done checking for max_fail_percentage 42613 1727204599.14471: checking to see if all hosts have failed and the running result is not ok 42613 1727204599.14472: done checking to see if all hosts have failed 42613 1727204599.14472: getting the remaining hosts for this loop 42613 1727204599.14473: done getting the remaining hosts for this loop 42613 1727204599.14476: getting the next task for host managed-node3 42613 1727204599.14480: done getting next task for host managed-node3 42613 1727204599.14481: ^ task is: None 42613 1727204599.14483: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204599.14484: done queuing things up, now waiting for results queue to drain 42613 1727204599.14485: results queue empty 42613 1727204599.14486: checking for any_errors_fatal 42613 1727204599.14487: done checking for any_errors_fatal 42613 1727204599.14487: checking for max_fail_percentage 42613 1727204599.14488: done checking for max_fail_percentage 42613 1727204599.14489: checking to see if all hosts have failed and the running result is not ok 42613 1727204599.14490: done checking to see if all hosts have failed 42613 1727204599.14492: getting the next task for host managed-node3 42613 1727204599.14495: done getting next task for host managed-node3 42613 1727204599.14496: ^ task is: None 42613 1727204599.14497: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204599.14548: in VariableManager get_vars() 42613 1727204599.14575: done with get_vars() 42613 1727204599.14582: in VariableManager get_vars() 42613 1727204599.14596: done with get_vars() 42613 1727204599.14600: variable 'omit' from source: magic vars 42613 1727204599.14703: variable 'profile' from source: play vars 42613 1727204599.14816: in VariableManager get_vars() 42613 1727204599.14832: done with get_vars() 42613 1727204599.14855: variable 'omit' from source: magic vars 42613 1727204599.14929: variable 'profile' from source: play vars PLAY [Set down {{ profile }}] ************************************************** 42613 1727204599.15768: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 42613 1727204599.15795: getting the remaining hosts for this loop 42613 1727204599.15796: done getting the remaining hosts for this loop 42613 1727204599.15799: getting the next task for host managed-node3 42613 1727204599.15802: done getting next task for host managed-node3 42613 1727204599.15804: ^ task is: TASK: Gathering Facts 42613 1727204599.15806: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204599.15808: getting variables 42613 1727204599.15809: in VariableManager get_vars() 42613 1727204599.15824: Calling all_inventory to load vars for managed-node3 42613 1727204599.15826: Calling groups_inventory to load vars for managed-node3 42613 1727204599.15828: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204599.15835: Calling all_plugins_play to load vars for managed-node3 42613 1727204599.15837: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204599.15840: Calling groups_plugins_play to load vars for managed-node3 42613 1727204599.17318: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204599.19458: done with get_vars() 42613 1727204599.19494: done getting variables 42613 1727204599.19545: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3 Tuesday 24 September 2024 15:03:19 -0400 (0:00:00.898) 0:00:27.803 ***** 42613 1727204599.19575: entering _queue_task() for managed-node3/gather_facts 42613 1727204599.19938: worker is 1 (out of 1 available) 42613 1727204599.19951: exiting _queue_task() for managed-node3/gather_facts 42613 1727204599.19963: done queuing things up, now waiting for results queue to drain 42613 1727204599.19964: waiting for pending results... 
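Throughout this run, each `_low_level_execute_command()` reuses one SSH connection rather than opening a new one: the stderr chunks repeatedly show `auto-mux: Trying existing master at '/root/.ansible/cp/...'`. That is OpenSSH connection multiplexing, which Ansible's ssh connection plugin enables by default. An equivalent explicit configuration (values are the common defaults, assumed here; they can differ by Ansible version) would be:

```ini
; ansible.cfg -- illustrative sketch of the multiplexing settings in effect
[ssh_connection]
ssh_args = -o ControlMaster=auto -o ControlPersist=60s
control_path_dir = ~/.ansible/cp
```

`ControlMaster=auto` lets the first SSH session become the master; `ControlPersist` keeps it alive between module invocations, which is why every command in this log completes with only a `mux_client_request_session` round-trip instead of a full handshake.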
42613 1727204599.20261: running TaskExecutor() for managed-node3/TASK: Gathering Facts 42613 1727204599.20503: in run() - task 127b8e07-fff9-2f91-05d8-0000000004b1 42613 1727204599.20508: variable 'ansible_search_path' from source: unknown 42613 1727204599.20511: calling self._execute() 42613 1727204599.20557: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204599.20575: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204599.20588: variable 'omit' from source: magic vars 42613 1727204599.21014: variable 'ansible_distribution_major_version' from source: facts 42613 1727204599.21031: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204599.21048: variable 'omit' from source: magic vars 42613 1727204599.21088: variable 'omit' from source: magic vars 42613 1727204599.21135: variable 'omit' from source: magic vars 42613 1727204599.21194: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 42613 1727204599.21242: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 42613 1727204599.21277: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 42613 1727204599.21302: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204599.21321: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204599.21361: variable 'inventory_hostname' from source: host vars for 'managed-node3' 42613 1727204599.21378: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204599.21473: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204599.21516: Set connection var ansible_shell_executable to /bin/sh 42613 1727204599.21529: Set 
connection var ansible_pipelining to False 42613 1727204599.21543: Set connection var ansible_module_compression to ZIP_DEFLATED 42613 1727204599.21550: Set connection var ansible_connection to ssh 42613 1727204599.21561: Set connection var ansible_timeout to 10 42613 1727204599.21572: Set connection var ansible_shell_type to sh 42613 1727204599.21607: variable 'ansible_shell_executable' from source: unknown 42613 1727204599.21615: variable 'ansible_connection' from source: unknown 42613 1727204599.21623: variable 'ansible_module_compression' from source: unknown 42613 1727204599.21631: variable 'ansible_shell_type' from source: unknown 42613 1727204599.21638: variable 'ansible_shell_executable' from source: unknown 42613 1727204599.21646: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204599.21653: variable 'ansible_pipelining' from source: unknown 42613 1727204599.21660: variable 'ansible_timeout' from source: unknown 42613 1727204599.21671: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204599.21888: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 42613 1727204599.21906: variable 'omit' from source: magic vars 42613 1727204599.21971: starting attempt loop 42613 1727204599.21975: running the handler 42613 1727204599.21977: variable 'ansible_facts' from source: unknown 42613 1727204599.21980: _low_level_execute_command(): starting 42613 1727204599.21988: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 42613 1727204599.23009: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204599.23301: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204599.23306: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204599.23497: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204599.25674: stdout chunk (state=3): >>>/root <<< 42613 1727204599.25678: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204599.25680: stdout chunk (state=3): >>><<< 42613 1727204599.25683: stderr chunk (state=3): >>><<< 42613 1727204599.25685: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204599.25689: _low_level_execute_command(): starting 42613 1727204599.25691: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204599.2556922-44632-59985817462728 `" && echo ansible-tmp-1727204599.2556922-44632-59985817462728="` echo /root/.ansible/tmp/ansible-tmp-1727204599.2556922-44632-59985817462728 `" ) && sleep 0' 42613 1727204599.26682: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204599.26686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204599.26689: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204599.26991: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204599.27288: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204599.27377: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204599.29691: stdout chunk (state=3): >>>ansible-tmp-1727204599.2556922-44632-59985817462728=/root/.ansible/tmp/ansible-tmp-1727204599.2556922-44632-59985817462728 <<< 42613 1727204599.29695: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204599.29774: stderr chunk (state=3): >>><<< 42613 1727204599.29779: stdout chunk (state=3): >>><<< 42613 1727204599.29799: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204599.2556922-44632-59985817462728=/root/.ansible/tmp/ansible-tmp-1727204599.2556922-44632-59985817462728 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204599.30073: variable 'ansible_module_compression' from source: unknown 42613 1727204599.30077: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-42613igbktgy6/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 42613 1727204599.30079: variable 'ansible_facts' from source: unknown 42613 1727204599.30646: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204599.2556922-44632-59985817462728/AnsiballZ_setup.py 42613 1727204599.31296: Sending initial data 42613 1727204599.31308: Sent initial data (153 bytes) 42613 1727204599.32542: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204599.32689: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204599.32762: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/1846617821' <<< 42613 1727204599.32854: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204599.32872: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204599.32975: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204599.34791: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 42613 1727204599.34850: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 42613 1727204599.34928: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-42613igbktgy6/tmpyhuzqhfr /root/.ansible/tmp/ansible-tmp-1727204599.2556922-44632-59985817462728/AnsiballZ_setup.py <<< 42613 1727204599.34931: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204599.2556922-44632-59985817462728/AnsiballZ_setup.py" <<< 42613 1727204599.35029: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-42613igbktgy6/tmpyhuzqhfr" to remote "/root/.ansible/tmp/ansible-tmp-1727204599.2556922-44632-59985817462728/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204599.2556922-44632-59985817462728/AnsiballZ_setup.py" <<< 42613 1727204599.38392: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204599.38484: stderr chunk (state=3): >>><<< 42613 1727204599.38495: stdout chunk (state=3): >>><<< 42613 1727204599.38532: done transferring module to remote 42613 1727204599.38590: _low_level_execute_command(): starting 42613 1727204599.38601: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204599.2556922-44632-59985817462728/ /root/.ansible/tmp/ansible-tmp-1727204599.2556922-44632-59985817462728/AnsiballZ_setup.py && sleep 0' 42613 1727204599.39970: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204599.39992: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204599.40106: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204599.40408: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204599.40483: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204599.42624: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204599.42629: stdout chunk (state=3): >>><<< 42613 1727204599.42631: stderr chunk (state=3): >>><<< 42613 1727204599.42650: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204599.42659: _low_level_execute_command(): starting 42613 1727204599.42672: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204599.2556922-44632-59985817462728/AnsiballZ_setup.py && sleep 0' 42613 1727204599.44573: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204599.44592: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204599.44608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204599.44848: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204599.44870: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204599.45092: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 42613 1727204600.16071: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDMno02CCUpdfdqhSLw3DEMgRB2qjjltGvjvvCA4FaqYC6LCLimL/modyFwZBTr6mod5aVbEv6eVT3cFEFELkt0kQvtOziptiFW5YZ0dlqvF004nHv7tpUqUboaKXf3hY9kfDIHOuUKZOV1AH7UTuNGixxTuYXFQ+fG7hLGh4Vep864Qk6wN5hv56JDtXEzMMB7xxbnEU6nTFIA8TIX+aYYYxIipVJjI+TR9J9VhQf+oNDJhhqqHyobnqG5WTt3jEYQo+8cWC4B8LegOCwae4jpCrLwhKnzmvV787NTqy90vgHgain4GhWTCKI+2dFsqryKBgKIBuENXOpmpRGT4gqBQHbc5v/vxWqYoPPhg1Wb8R+WRueYbdol4I10CveFNShlWCSRLSu/vOutS0xtU3WEIQFs2Mn06Aqq6bMoG70EJ9bJOEQ82f23VIkVKoW1hmcKHTCrMv715oNONo08LOTQkBYDv3MQpAtFQnpuIVPlAXYu1spAx3i2i31866ukCUE=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBG5WDcSyqLv17rg+6P3+2pjKf2x2X+Jf8yHGACagVeIm/l8LWG2NszXTHOdaZlbD4aes7hBRe0B7oCa8ilqHGf0=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAII5o3yNikV31ncy7je2EsNwog36vbYT7D9w98r4ZeD7x", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2f45753f4562ec5cb5af974e0ba4b7", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", 
"root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_loadavg": {"1m": 0.537109375, "5m": 0.6171875, "15m": 0.419921875}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_fips": false, "ansible_is_chroot": false, "ansible_lsb": {}, "ansible_iscsi_iqn": "", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3048, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 
3716, "used": 668, "free": 3048}, "nocache": {"free": 3493, "used": 223}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansi<<< 42613 1727204600.16123: stdout chunk (state=3): >>>ble_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2f4575-3f45-62ec-5cb5-af974e0ba4b7", "ansible_product_uuid": "ec2f4575-3f45-62ec-5cb5-af974e0ba4b7", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", 
"size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 937, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251303661568, "block_size": 4096, "block_total": 64479564, "block_available": 61353433, "block_used": 3126131, "inode_total": 16384000, "inode_available": 16301441, "inode_used": 82559, "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391"}], "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 39754 10.31.45.169 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 39754 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_interfaces": ["ethtest0", "eth0", "lo", "peerethtest0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on 
[fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", 
"macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "02:aa:78:a8:9b:13", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.169", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::aa:78ff:fea8:9b13", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": 
"off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "<<< 42613 1727204600.16157: stdout chunk (state=3): >>>esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_ethtest0": {"device": "ethtest0", "macaddress": "f6:13:d9:76:0f:3f", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv4": {"address": "198.51.100.3", "broadcast": "198.51.100.63", "netmask": "255.255.255.192", "network": "198.51.100.0", "prefix": "26"}, "ipv6": [{"address": "2001:db8::2", "prefix": "32", "scope": "global"}, {"address": "fe80::f413:d9ff:fe76:f3f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", 
"generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerethtest0": {"device": "peerethtest0", "macaddress": "ce:18:00:21:fb:93", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::cc18:ff:fe21:fb93", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", 
"tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off 
[fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.169", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:aa:78:a8:9b:13", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.169", "198.51.100.3"], "ansible_all_ipv6_addresses": ["fe80::aa:78ff:fea8:9b13", "2001:db8::2", "fe80::f413:d9ff:fe76:f3f", "fe80::cc18:ff:fe21:fb93"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.169", "127.0.0.0/8", "127.0.0.1", "198.51.100.3"], "ipv6": ["::1", "2001:db8::2", "fe80::aa:78ff:fea8:9b13", "fe80::cc18:ff:fe21:fb93", "fe80::f413:d9ff:fe76:f3f"]}, "ansible_fibre_channel_wwn": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "03", "second": "20", "epoch": "1727204600", "epoch_int": "1727204600", "date": "2024-09-24", "time": "15:03:20", "iso8601_micro": "2024-09-24T19:03:20.156227Z", "iso8601": "2024-09-24T19:03:20Z", "iso8601_basic": "20240924T150320156227", "iso8601_basic_short": "20240924T150320", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_service_mgr": "systemd", "ansible_apparmor": {"status": "disabled"}, "ansible_local": {}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 42613 1727204600.18450: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
<<< 42613 1727204600.18482: stdout chunk (state=3): >>><<< 42613 1727204600.18871: stderr chunk (state=3): >>><<< 42613 1727204600.18879: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDMno02CCUpdfdqhSLw3DEMgRB2qjjltGvjvvCA4FaqYC6LCLimL/modyFwZBTr6mod5aVbEv6eVT3cFEFELkt0kQvtOziptiFW5YZ0dlqvF004nHv7tpUqUboaKXf3hY9kfDIHOuUKZOV1AH7UTuNGixxTuYXFQ+fG7hLGh4Vep864Qk6wN5hv56JDtXEzMMB7xxbnEU6nTFIA8TIX+aYYYxIipVJjI+TR9J9VhQf+oNDJhhqqHyobnqG5WTt3jEYQo+8cWC4B8LegOCwae4jpCrLwhKnzmvV787NTqy90vgHgain4GhWTCKI+2dFsqryKBgKIBuENXOpmpRGT4gqBQHbc5v/vxWqYoPPhg1Wb8R+WRueYbdol4I10CveFNShlWCSRLSu/vOutS0xtU3WEIQFs2Mn06Aqq6bMoG70EJ9bJOEQ82f23VIkVKoW1hmcKHTCrMv715oNONo08LOTQkBYDv3MQpAtFQnpuIVPlAXYu1spAx3i2i31866ukCUE=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBG5WDcSyqLv17rg+6P3+2pjKf2x2X+Jf8yHGACagVeIm/l8LWG2NszXTHOdaZlbD4aes7hBRe0B7oCa8ilqHGf0=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAII5o3yNikV31ncy7je2EsNwog36vbYT7D9w98r4ZeD7x", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2f45753f4562ec5cb5af974e0ba4b7", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": 
"ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_loadavg": {"1m": 0.537109375, "5m": 0.6171875, "15m": 0.419921875}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_fips": false, "ansible_is_chroot": false, "ansible_lsb": {}, "ansible_iscsi_iqn": "", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 
3048, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 668, "free": 3048}, "nocache": {"free": 3493, "used": 223}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2f4575-3f45-62ec-5cb5-af974e0ba4b7", "ansible_product_uuid": "ec2f4575-3f45-62ec-5cb5-af974e0ba4b7", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": 
"", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 937, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251303661568, "block_size": 4096, "block_total": 64479564, "block_available": 61353433, "block_used": 3126131, "inode_total": 16384000, "inode_available": 16301441, "inode_used": 82559, "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391"}], "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 39754 10.31.45.169 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 39754 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_interfaces": ["ethtest0", "eth0", "lo", "peerethtest0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": 
"host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", 
"tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "02:aa:78:a8:9b:13", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.169", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::aa:78ff:fea8:9b13", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", 
"tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_ethtest0": {"device": "ethtest0", "macaddress": "f6:13:d9:76:0f:3f", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv4": {"address": "198.51.100.3", "broadcast": "198.51.100.63", "netmask": "255.255.255.192", "network": "198.51.100.0", "prefix": "26"}, "ipv6": [{"address": "2001:db8::2", "prefix": "32", "scope": "global"}, {"address": "fe80::f413:d9ff:fe76:f3f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": 
"on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerethtest0": {"device": "peerethtest0", "macaddress": "ce:18:00:21:fb:93", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::cc18:ff:fe21:fb93", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", 
"tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off 
[fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.169", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:aa:78:a8:9b:13", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.169", "198.51.100.3"], "ansible_all_ipv6_addresses": ["fe80::aa:78ff:fea8:9b13", "2001:db8::2", "fe80::f413:d9ff:fe76:f3f", "fe80::cc18:ff:fe21:fb93"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.169", "127.0.0.0/8", "127.0.0.1", "198.51.100.3"], "ipv6": ["::1", "2001:db8::2", "fe80::aa:78ff:fea8:9b13", "fe80::cc18:ff:fe21:fb93", "fe80::f413:d9ff:fe76:f3f"]}, "ansible_fibre_channel_wwn": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "03", "second": "20", "epoch": "1727204600", "epoch_int": "1727204600", "date": "2024-09-24", "time": "15:03:20", "iso8601_micro": "2024-09-24T19:03:20.156227Z", "iso8601": "2024-09-24T19:03:20Z", "iso8601_basic": "20240924T150320156227", "iso8601_basic_short": "20240924T150320", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_service_mgr": "systemd", "ansible_apparmor": {"status": "disabled"}, "ansible_local": {}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
42613 1727204600.19498: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204599.2556922-44632-59985817462728/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 42613 1727204600.19530: _low_level_execute_command(): starting 42613 1727204600.19539: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204599.2556922-44632-59985817462728/ > /dev/null 2>&1 && sleep 0' 42613 1727204600.20181: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204600.20197: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204600.20282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204600.20323: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204600.20342: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204600.20370: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204600.20475: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204600.22791: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204600.23117: stderr chunk (state=3): >>><<< 42613 1727204600.23121: stdout chunk (state=3): >>><<< 42613 1727204600.23123: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 
1727204600.23126: handler run complete 42613 1727204600.23407: variable 'ansible_facts' from source: unknown 42613 1727204600.23905: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204600.24855: variable 'ansible_facts' from source: unknown 42613 1727204600.25115: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204600.25609: attempt loop complete, returning result 42613 1727204600.25613: _execute() done 42613 1727204600.25615: dumping result to json 42613 1727204600.25698: done dumping result, returning 42613 1727204600.25721: done running TaskExecutor() for managed-node3/TASK: Gathering Facts [127b8e07-fff9-2f91-05d8-0000000004b1] 42613 1727204600.25825: sending task result for task 127b8e07-fff9-2f91-05d8-0000000004b1 42613 1727204600.27686: done sending task result for task 127b8e07-fff9-2f91-05d8-0000000004b1 42613 1727204600.27691: WORKER PROCESS EXITING ok: [managed-node3] 42613 1727204600.28479: no more pending results, returning what we have 42613 1727204600.28483: results queue empty 42613 1727204600.28484: checking for any_errors_fatal 42613 1727204600.28486: done checking for any_errors_fatal 42613 1727204600.28487: checking for max_fail_percentage 42613 1727204600.28488: done checking for max_fail_percentage 42613 1727204600.28489: checking to see if all hosts have failed and the running result is not ok 42613 1727204600.28490: done checking to see if all hosts have failed 42613 1727204600.28491: getting the remaining hosts for this loop 42613 1727204600.28492: done getting the remaining hosts for this loop 42613 1727204600.28497: getting the next task for host managed-node3 42613 1727204600.28502: done getting next task for host managed-node3 42613 1727204600.28504: ^ task is: TASK: meta (flush_handlers) 42613 1727204600.28506: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 42613 1727204600.28511: getting variables 42613 1727204600.28512: in VariableManager get_vars() 42613 1727204600.28544: Calling all_inventory to load vars for managed-node3 42613 1727204600.28547: Calling groups_inventory to load vars for managed-node3 42613 1727204600.28549: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204600.28560: Calling all_plugins_play to load vars for managed-node3 42613 1727204600.28670: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204600.28682: Calling groups_plugins_play to load vars for managed-node3 42613 1727204600.32348: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204600.37304: done with get_vars() 42613 1727204600.37352: done getting variables 42613 1727204600.37484: in VariableManager get_vars() 42613 1727204600.37500: Calling all_inventory to load vars for managed-node3 42613 1727204600.37503: Calling groups_inventory to load vars for managed-node3 42613 1727204600.37505: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204600.37511: Calling all_plugins_play to load vars for managed-node3 42613 1727204600.37514: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204600.37517: Calling groups_plugins_play to load vars for managed-node3 42613 1727204600.39442: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204600.42064: done with get_vars() 42613 1727204600.42115: done queuing things up, now waiting for results queue to drain 42613 1727204600.42118: results queue empty 42613 1727204600.42119: checking for any_errors_fatal 42613 1727204600.42123: done checking for any_errors_fatal 42613 
1727204600.42124: checking for max_fail_percentage 42613 1727204600.42125: done checking for max_fail_percentage 42613 1727204600.42131: checking to see if all hosts have failed and the running result is not ok 42613 1727204600.42132: done checking to see if all hosts have failed 42613 1727204600.42133: getting the remaining hosts for this loop 42613 1727204600.42134: done getting the remaining hosts for this loop 42613 1727204600.42137: getting the next task for host managed-node3 42613 1727204600.42142: done getting next task for host managed-node3 42613 1727204600.42145: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 42613 1727204600.42147: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 42613 1727204600.42158: getting variables 42613 1727204600.42159: in VariableManager get_vars() 42613 1727204600.42179: Calling all_inventory to load vars for managed-node3 42613 1727204600.42182: Calling groups_inventory to load vars for managed-node3 42613 1727204600.42184: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204600.42190: Calling all_plugins_play to load vars for managed-node3 42613 1727204600.42193: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204600.42196: Calling groups_plugins_play to load vars for managed-node3 42613 1727204600.44088: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204600.48003: done with get_vars() 42613 1727204600.48045: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 
24 September 2024 15:03:20 -0400 (0:00:01.286) 0:00:29.090 ***** 42613 1727204600.48251: entering _queue_task() for managed-node3/include_tasks 42613 1727204600.49166: worker is 1 (out of 1 available) 42613 1727204600.49182: exiting _queue_task() for managed-node3/include_tasks 42613 1727204600.49198: done queuing things up, now waiting for results queue to drain 42613 1727204600.49200: waiting for pending results... 42613 1727204600.50186: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 42613 1727204600.50677: in run() - task 127b8e07-fff9-2f91-05d8-000000000071 42613 1727204600.50684: variable 'ansible_search_path' from source: unknown 42613 1727204600.50688: variable 'ansible_search_path' from source: unknown 42613 1727204600.50692: calling self._execute() 42613 1727204600.50696: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204600.50700: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204600.50784: variable 'omit' from source: magic vars 42613 1727204600.51694: variable 'ansible_distribution_major_version' from source: facts 42613 1727204600.51714: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204600.51727: _execute() done 42613 1727204600.51736: dumping result to json 42613 1727204600.51744: done dumping result, returning 42613 1727204600.51757: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [127b8e07-fff9-2f91-05d8-000000000071] 42613 1727204600.51772: sending task result for task 127b8e07-fff9-2f91-05d8-000000000071 42613 1727204600.51952: no more pending results, returning what we have 42613 1727204600.51958: in VariableManager get_vars() 42613 1727204600.52018: Calling all_inventory to load vars for managed-node3 42613 1727204600.52022: Calling groups_inventory to load vars for managed-node3 42613 
1727204600.52024: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204600.52042: Calling all_plugins_play to load vars for managed-node3 42613 1727204600.52045: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204600.52048: Calling groups_plugins_play to load vars for managed-node3 42613 1727204600.52725: done sending task result for task 127b8e07-fff9-2f91-05d8-000000000071 42613 1727204600.52729: WORKER PROCESS EXITING 42613 1727204600.55846: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204600.60779: done with get_vars() 42613 1727204600.60825: variable 'ansible_search_path' from source: unknown 42613 1727204600.60827: variable 'ansible_search_path' from source: unknown 42613 1727204600.60864: we have included files to process 42613 1727204600.60868: generating all_blocks data 42613 1727204600.60869: done generating all_blocks data 42613 1727204600.60870: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 42613 1727204600.60872: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 42613 1727204600.60874: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 42613 1727204600.62370: done processing included file 42613 1727204600.62372: iterating over new_blocks loaded from include file 42613 1727204600.62374: in VariableManager get_vars() 42613 1727204600.62398: done with get_vars() 42613 1727204600.62400: filtering new block on tags 42613 1727204600.62418: done filtering new block on tags 42613 1727204600.62421: in VariableManager get_vars() 42613 1727204600.62558: done with get_vars() 42613 1727204600.62560: filtering new block on tags 42613 1727204600.62583: done filtering new block on tags 42613 1727204600.62586: in VariableManager 
get_vars() 42613 1727204600.62608: done with get_vars() 42613 1727204600.62610: filtering new block on tags 42613 1727204600.62625: done filtering new block on tags 42613 1727204600.62627: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node3 42613 1727204600.62634: extending task lists for all hosts with included blocks 42613 1727204600.63591: done extending task lists 42613 1727204600.63593: done processing included files 42613 1727204600.63594: results queue empty 42613 1727204600.63594: checking for any_errors_fatal 42613 1727204600.63596: done checking for any_errors_fatal 42613 1727204600.63597: checking for max_fail_percentage 42613 1727204600.63598: done checking for max_fail_percentage 42613 1727204600.63599: checking to see if all hosts have failed and the running result is not ok 42613 1727204600.63600: done checking to see if all hosts have failed 42613 1727204600.63601: getting the remaining hosts for this loop 42613 1727204600.63602: done getting the remaining hosts for this loop 42613 1727204600.63605: getting the next task for host managed-node3 42613 1727204600.63608: done getting next task for host managed-node3 42613 1727204600.63611: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 42613 1727204600.63614: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204600.63625: getting variables 42613 1727204600.63627: in VariableManager get_vars() 42613 1727204600.63763: Calling all_inventory to load vars for managed-node3 42613 1727204600.63769: Calling groups_inventory to load vars for managed-node3 42613 1727204600.63771: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204600.63777: Calling all_plugins_play to load vars for managed-node3 42613 1727204600.63779: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204600.63782: Calling groups_plugins_play to load vars for managed-node3 42613 1727204600.67648: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204600.70287: done with get_vars() 42613 1727204600.70326: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 15:03:20 -0400 (0:00:00.221) 0:00:29.312 ***** 42613 1727204600.70425: entering _queue_task() for managed-node3/setup 42613 1727204600.71245: worker is 1 (out of 1 available) 42613 1727204600.71261: exiting _queue_task() for managed-node3/setup 42613 1727204600.71430: done queuing things up, now waiting for results queue to drain 42613 1727204600.71433: waiting for pending results... 
42613 1727204600.71809: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 42613 1727204600.71997: in run() - task 127b8e07-fff9-2f91-05d8-0000000004f2 42613 1727204600.72015: variable 'ansible_search_path' from source: unknown 42613 1727204600.72025: variable 'ansible_search_path' from source: unknown 42613 1727204600.72082: calling self._execute() 42613 1727204600.72214: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204600.72227: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204600.72245: variable 'omit' from source: magic vars 42613 1727204600.72711: variable 'ansible_distribution_major_version' from source: facts 42613 1727204600.72736: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204600.73021: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 42613 1727204600.76001: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 42613 1727204600.76095: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 42613 1727204600.76172: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 42613 1727204600.76200: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 42613 1727204600.76271: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 42613 1727204600.76348: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204600.76389: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204600.76423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204600.76488: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204600.76555: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204600.76586: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204600.76615: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204600.76650: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204600.76707: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204600.76727: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204600.76937: variable '__network_required_facts' from source: role 
'' defaults 42613 1727204600.76991: variable 'ansible_facts' from source: unknown 42613 1727204600.78165: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 42613 1727204600.78180: when evaluation is False, skipping this task 42613 1727204600.78194: _execute() done 42613 1727204600.78203: dumping result to json 42613 1727204600.78270: done dumping result, returning 42613 1727204600.78274: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [127b8e07-fff9-2f91-05d8-0000000004f2] 42613 1727204600.78276: sending task result for task 127b8e07-fff9-2f91-05d8-0000000004f2 skipping: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 42613 1727204600.78461: no more pending results, returning what we have 42613 1727204600.78468: results queue empty 42613 1727204600.78469: checking for any_errors_fatal 42613 1727204600.78471: done checking for any_errors_fatal 42613 1727204600.78471: checking for max_fail_percentage 42613 1727204600.78473: done checking for max_fail_percentage 42613 1727204600.78474: checking to see if all hosts have failed and the running result is not ok 42613 1727204600.78476: done checking to see if all hosts have failed 42613 1727204600.78476: getting the remaining hosts for this loop 42613 1727204600.78478: done getting the remaining hosts for this loop 42613 1727204600.78483: getting the next task for host managed-node3 42613 1727204600.78494: done getting next task for host managed-node3 42613 1727204600.78498: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 42613 1727204600.78501: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 42613 1727204600.78520: getting variables 42613 1727204600.78522: in VariableManager get_vars() 42613 1727204600.78785: Calling all_inventory to load vars for managed-node3 42613 1727204600.78789: Calling groups_inventory to load vars for managed-node3 42613 1727204600.78792: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204600.78808: Calling all_plugins_play to load vars for managed-node3 42613 1727204600.78811: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204600.78815: Calling groups_plugins_play to load vars for managed-node3 42613 1727204600.79398: done sending task result for task 127b8e07-fff9-2f91-05d8-0000000004f2 42613 1727204600.79402: WORKER PROCESS EXITING 42613 1727204600.81246: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204600.84173: done with get_vars() 42613 1727204600.84210: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 15:03:20 -0400 (0:00:00.140) 0:00:29.452 ***** 42613 1727204600.84433: entering _queue_task() for managed-node3/stat 42613 1727204600.85562: worker is 1 (out of 1 available) 42613 1727204600.85675: exiting _queue_task() for managed-node3/stat 42613 1727204600.85693: done queuing things up, now waiting for results queue to drain 42613 1727204600.85695: waiting for pending results... 
42613 1727204600.86609: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 42613 1727204600.87009: in run() - task 127b8e07-fff9-2f91-05d8-0000000004f4 42613 1727204600.87046: variable 'ansible_search_path' from source: unknown 42613 1727204600.87172: variable 'ansible_search_path' from source: unknown 42613 1727204600.87207: calling self._execute() 42613 1727204600.87345: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204600.87673: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204600.87679: variable 'omit' from source: magic vars 42613 1727204600.88451: variable 'ansible_distribution_major_version' from source: facts 42613 1727204600.88478: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204600.88903: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 42613 1727204600.89669: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 42613 1727204600.89736: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 42613 1727204600.89914: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 42613 1727204600.89964: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 42613 1727204600.90275: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 42613 1727204600.90279: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 42613 1727204600.90282: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204600.90284: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 42613 1727204600.90444: variable '__network_is_ostree' from source: set_fact 42613 1727204600.90458: Evaluated conditional (not __network_is_ostree is defined): False 42613 1727204600.90468: when evaluation is False, skipping this task 42613 1727204600.90477: _execute() done 42613 1727204600.90484: dumping result to json 42613 1727204600.90493: done dumping result, returning 42613 1727204600.90505: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [127b8e07-fff9-2f91-05d8-0000000004f4] 42613 1727204600.90514: sending task result for task 127b8e07-fff9-2f91-05d8-0000000004f4 skipping: [managed-node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 42613 1727204600.90696: no more pending results, returning what we have 42613 1727204600.90700: results queue empty 42613 1727204600.90701: checking for any_errors_fatal 42613 1727204600.90708: done checking for any_errors_fatal 42613 1727204600.90709: checking for max_fail_percentage 42613 1727204600.90711: done checking for max_fail_percentage 42613 1727204600.90712: checking to see if all hosts have failed and the running result is not ok 42613 1727204600.90713: done checking to see if all hosts have failed 42613 1727204600.90714: getting the remaining hosts for this loop 42613 1727204600.90716: done getting the remaining hosts for this loop 42613 1727204600.90721: getting the next task for host managed-node3 42613 1727204600.90728: done getting next task for host managed-node3 42613 
1727204600.90733: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 42613 1727204600.90736: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 42613 1727204600.90751: getting variables 42613 1727204600.90753: in VariableManager get_vars() 42613 1727204600.90799: Calling all_inventory to load vars for managed-node3 42613 1727204600.90803: Calling groups_inventory to load vars for managed-node3 42613 1727204600.90805: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204600.90819: Calling all_plugins_play to load vars for managed-node3 42613 1727204600.90824: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204600.90828: Calling groups_plugins_play to load vars for managed-node3 42613 1727204600.91553: done sending task result for task 127b8e07-fff9-2f91-05d8-0000000004f4 42613 1727204600.91558: WORKER PROCESS EXITING 42613 1727204600.93073: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204600.95592: done with get_vars() 42613 1727204600.95628: done getting variables 42613 1727204600.95701: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 
TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 15:03:20 -0400 (0:00:00.113) 0:00:29.565 ***** 42613 1727204600.95747: entering _queue_task() for managed-node3/set_fact 42613 1727204600.96259: worker is 1 (out of 1 available) 42613 1727204600.96274: exiting _queue_task() for managed-node3/set_fact 42613 1727204600.96287: done queuing things up, now waiting for results queue to drain 42613 1727204600.96288: waiting for pending results... 42613 1727204600.96510: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 42613 1727204600.96656: in run() - task 127b8e07-fff9-2f91-05d8-0000000004f5 42613 1727204600.96680: variable 'ansible_search_path' from source: unknown 42613 1727204600.96688: variable 'ansible_search_path' from source: unknown 42613 1727204600.96740: calling self._execute() 42613 1727204600.96858: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204600.96873: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204600.96887: variable 'omit' from source: magic vars 42613 1727204600.97330: variable 'ansible_distribution_major_version' from source: facts 42613 1727204600.97355: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204600.97560: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 42613 1727204600.97877: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 42613 1727204600.97943: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 42613 1727204600.97989: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 42613 
1727204600.98038: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 42613 1727204600.98145: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 42613 1727204600.98181: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 42613 1727204600.98212: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204600.98255: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 42613 1727204600.98367: variable '__network_is_ostree' from source: set_fact 42613 1727204600.98380: Evaluated conditional (not __network_is_ostree is defined): False 42613 1727204600.98387: when evaluation is False, skipping this task 42613 1727204600.98393: _execute() done 42613 1727204600.98400: dumping result to json 42613 1727204600.98408: done dumping result, returning 42613 1727204600.98422: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [127b8e07-fff9-2f91-05d8-0000000004f5] 42613 1727204600.98433: sending task result for task 127b8e07-fff9-2f91-05d8-0000000004f5 skipping: [managed-node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 42613 1727204600.98719: no more pending results, returning what we have 42613 1727204600.98722: results queue empty 42613 1727204600.98724: checking for any_errors_fatal 42613 1727204600.98730: done checking 
for any_errors_fatal 42613 1727204600.98731: checking for max_fail_percentage 42613 1727204600.98734: done checking for max_fail_percentage 42613 1727204600.98734: checking to see if all hosts have failed and the running result is not ok 42613 1727204600.98735: done checking to see if all hosts have failed 42613 1727204600.98736: getting the remaining hosts for this loop 42613 1727204600.98738: done getting the remaining hosts for this loop 42613 1727204600.98742: getting the next task for host managed-node3 42613 1727204600.98753: done getting next task for host managed-node3 42613 1727204600.98757: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 42613 1727204600.98760: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204600.98780: getting variables 42613 1727204600.98782: in VariableManager get_vars() 42613 1727204600.98824: Calling all_inventory to load vars for managed-node3 42613 1727204600.98828: Calling groups_inventory to load vars for managed-node3 42613 1727204600.98830: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204600.98843: Calling all_plugins_play to load vars for managed-node3 42613 1727204600.98847: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204600.98850: Calling groups_plugins_play to load vars for managed-node3 42613 1727204600.99387: done sending task result for task 127b8e07-fff9-2f91-05d8-0000000004f5 42613 1727204600.99391: WORKER PROCESS EXITING 42613 1727204601.01003: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204601.02484: done with get_vars() 42613 1727204601.02512: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 15:03:21 -0400 (0:00:00.068) 0:00:29.634 ***** 42613 1727204601.02597: entering _queue_task() for managed-node3/service_facts 42613 1727204601.02887: worker is 1 (out of 1 available) 42613 1727204601.02904: exiting _queue_task() for managed-node3/service_facts 42613 1727204601.02919: done queuing things up, now waiting for results queue to drain 42613 1727204601.02920: waiting for pending results... 
42613 1727204601.03117: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which services are running 42613 1727204601.03238: in run() - task 127b8e07-fff9-2f91-05d8-0000000004f7 42613 1727204601.03258: variable 'ansible_search_path' from source: unknown 42613 1727204601.03262: variable 'ansible_search_path' from source: unknown 42613 1727204601.03313: calling self._execute() 42613 1727204601.03404: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204601.03409: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204601.03412: variable 'omit' from source: magic vars 42613 1727204601.03825: variable 'ansible_distribution_major_version' from source: facts 42613 1727204601.03854: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204601.04073: variable 'omit' from source: magic vars 42613 1727204601.04077: variable 'omit' from source: magic vars 42613 1727204601.04080: variable 'omit' from source: magic vars 42613 1727204601.04082: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 42613 1727204601.04114: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 42613 1727204601.04151: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 42613 1727204601.04179: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204601.04200: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204601.04240: variable 'inventory_hostname' from source: host vars for 'managed-node3' 42613 1727204601.04250: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204601.04259: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed-node3' 42613 1727204601.04386: Set connection var ansible_shell_executable to /bin/sh 42613 1727204601.04405: Set connection var ansible_pipelining to False 42613 1727204601.04419: Set connection var ansible_module_compression to ZIP_DEFLATED 42613 1727204601.04427: Set connection var ansible_connection to ssh 42613 1727204601.04444: Set connection var ansible_timeout to 10 42613 1727204601.04453: Set connection var ansible_shell_type to sh 42613 1727204601.04482: variable 'ansible_shell_executable' from source: unknown 42613 1727204601.04491: variable 'ansible_connection' from source: unknown 42613 1727204601.04503: variable 'ansible_module_compression' from source: unknown 42613 1727204601.04513: variable 'ansible_shell_type' from source: unknown 42613 1727204601.04520: variable 'ansible_shell_executable' from source: unknown 42613 1727204601.04526: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204601.04533: variable 'ansible_pipelining' from source: unknown 42613 1727204601.04542: variable 'ansible_timeout' from source: unknown 42613 1727204601.04554: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204601.04771: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 42613 1727204601.04780: variable 'omit' from source: magic vars 42613 1727204601.04783: starting attempt loop 42613 1727204601.04786: running the handler 42613 1727204601.04798: _low_level_execute_command(): starting 42613 1727204601.04870: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 42613 1727204601.05879: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204601.05918: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204601.07749: stdout chunk (state=3): >>>/root <<< 42613 1727204601.07852: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204601.07919: stderr chunk (state=3): >>><<< 42613 1727204601.07926: stdout chunk (state=3): >>><<< 42613 1727204601.07957: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204601.07975: _low_level_execute_command(): starting 42613 1727204601.07984: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204601.0796328-44694-68513187746887 `" && echo ansible-tmp-1727204601.0796328-44694-68513187746887="` echo /root/.ansible/tmp/ansible-tmp-1727204601.0796328-44694-68513187746887 `" ) && sleep 0' 42613 1727204601.08459: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204601.08480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204601.08497: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204601.08555: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204601.08701: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204601.08901: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204601.10984: stdout chunk (state=3): >>>ansible-tmp-1727204601.0796328-44694-68513187746887=/root/.ansible/tmp/ansible-tmp-1727204601.0796328-44694-68513187746887 <<< 42613 1727204601.11204: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204601.11207: stdout chunk (state=3): >>><<< 42613 1727204601.11210: stderr chunk (state=3): >>><<< 42613 1727204601.11227: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204601.0796328-44694-68513187746887=/root/.ansible/tmp/ansible-tmp-1727204601.0796328-44694-68513187746887 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204601.11358: variable 'ansible_module_compression' from source: unknown 42613 1727204601.11362: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-42613igbktgy6/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 42613 1727204601.11407: variable 'ansible_facts' from source: unknown 42613 1727204601.11505: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204601.0796328-44694-68513187746887/AnsiballZ_service_facts.py 42613 1727204601.11709: Sending initial data 42613 1727204601.11712: Sent initial data (161 bytes) 42613 1727204601.12393: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204601.12463: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204601.12526: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204601.12563: stderr chunk (state=3): 
>>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204601.12581: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204601.12693: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204601.14495: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 42613 1727204601.14576: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 42613 1727204601.14655: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-42613igbktgy6/tmpiv5ym5jg /root/.ansible/tmp/ansible-tmp-1727204601.0796328-44694-68513187746887/AnsiballZ_service_facts.py <<< 42613 1727204601.14658: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204601.0796328-44694-68513187746887/AnsiballZ_service_facts.py" <<< 42613 1727204601.14730: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-42613igbktgy6/tmpiv5ym5jg" to remote "/root/.ansible/tmp/ansible-tmp-1727204601.0796328-44694-68513187746887/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204601.0796328-44694-68513187746887/AnsiballZ_service_facts.py" <<< 42613 1727204601.15774: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204601.15778: stdout chunk (state=3): >>><<< 42613 1727204601.15781: stderr chunk (state=3): >>><<< 42613 1727204601.15783: done transferring module to remote 42613 1727204601.15785: _low_level_execute_command(): starting 42613 1727204601.15787: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204601.0796328-44694-68513187746887/ /root/.ansible/tmp/ansible-tmp-1727204601.0796328-44694-68513187746887/AnsiballZ_service_facts.py && sleep 0' 42613 1727204601.16455: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204601.16475: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204601.16573: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204601.16607: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204601.16623: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204601.16648: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204601.16761: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204601.18888: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204601.18906: stdout chunk (state=3): >>><<< 42613 1727204601.18924: stderr chunk (state=3): >>><<< 42613 1727204601.19045: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204601.19049: _low_level_execute_command(): starting 42613 1727204601.19052: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204601.0796328-44694-68513187746887/AnsiballZ_service_facts.py && sleep 0' 42613 1727204601.19689: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204601.19750: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204601.19771: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204601.19789: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 42613 1727204601.19903: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204603.60139: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind<<< 42613 1727204603.60242: stdout chunk (state=3): >>>.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", 
"source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": 
"systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": 
{"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", 
"state": "running", "status": "active", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", 
"state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": 
{"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "nmstate.service": {"name": "nmstate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": 
"systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": 
"systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 42613 1727204603.62273: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
<<< 42613 1727204603.62278: stdout chunk (state=3): >>><<< 42613 1727204603.62280: stderr chunk (state=3): >>><<< 42613 1727204603.62286: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": 
"systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", 
"status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", 
"source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", 
"source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": 
"nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "nmstate.service": {"name": "nmstate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": 
"sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": 
"systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 42613 1727204603.63833: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204601.0796328-44694-68513187746887/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 42613 1727204603.63858: _low_level_execute_command(): starting 42613 1727204603.63873: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204601.0796328-44694-68513187746887/ > /dev/null 2>&1 && sleep 0' 42613 1727204603.64609: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204603.64631: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204603.64650: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204603.64675: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 42613 1727204603.64693: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 42613 1727204603.64705: stderr chunk (state=3): >>>debug2: match not found <<< 42613 1727204603.64754: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204603.64948: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204603.64963: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204603.65169: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204603.67307: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204603.67319: stdout chunk (state=3): >>><<< 42613 1727204603.67337: stderr chunk (state=3): >>><<< 42613 1727204603.67472: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204603.67477: handler run complete 42613 1727204603.67641: variable 'ansible_facts' from source: unknown 42613 1727204603.67853: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204603.68503: variable 'ansible_facts' from source: unknown 42613 1727204603.68699: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204603.69019: attempt loop complete, returning result 42613 1727204603.69035: _execute() done 42613 1727204603.69045: dumping result to json 42613 1727204603.69131: done dumping result, returning 42613 1727204603.69153: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which services are running [127b8e07-fff9-2f91-05d8-0000000004f7] 42613 1727204603.69169: sending task result for task 127b8e07-fff9-2f91-05d8-0000000004f7 42613 1727204603.71079: done sending task result for task 127b8e07-fff9-2f91-05d8-0000000004f7 42613 1727204603.71083: WORKER PROCESS EXITING ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 42613 1727204603.71169: no more pending results, returning what we have 42613 1727204603.71173: results queue empty 42613 1727204603.71174: checking for any_errors_fatal 42613 1727204603.71178: done checking for any_errors_fatal 42613 1727204603.71179: checking for max_fail_percentage 42613 1727204603.71181: done checking for max_fail_percentage 42613 1727204603.71181: checking to see if all hosts have failed and the running result is not ok 42613 
1727204603.71182: done checking to see if all hosts have failed 42613 1727204603.71183: getting the remaining hosts for this loop 42613 1727204603.71184: done getting the remaining hosts for this loop 42613 1727204603.71188: getting the next task for host managed-node3 42613 1727204603.71194: done getting next task for host managed-node3 42613 1727204603.71198: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 42613 1727204603.71201: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204603.71212: getting variables 42613 1727204603.71214: in VariableManager get_vars() 42613 1727204603.71248: Calling all_inventory to load vars for managed-node3 42613 1727204603.71251: Calling groups_inventory to load vars for managed-node3 42613 1727204603.71254: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204603.71306: Calling all_plugins_play to load vars for managed-node3 42613 1727204603.71311: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204603.71315: Calling groups_plugins_play to load vars for managed-node3 42613 1727204603.73092: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204603.76454: done with get_vars() 42613 1727204603.76499: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 15:03:23 -0400 (0:00:02.740) 0:00:32.374 ***** 42613 1727204603.76632: entering _queue_task() for managed-node3/package_facts 42613 1727204603.77106: worker is 1 (out of 1 available) 42613 1727204603.77121: exiting _queue_task() for managed-node3/package_facts 42613 1727204603.77141: done queuing things up, now waiting for results queue to drain 42613 1727204603.77142: waiting for pending results... 
42613 1727204603.77560: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 42613 1727204603.77774: in run() - task 127b8e07-fff9-2f91-05d8-0000000004f8 42613 1727204603.77779: variable 'ansible_search_path' from source: unknown 42613 1727204603.77782: variable 'ansible_search_path' from source: unknown 42613 1727204603.77856: calling self._execute() 42613 1727204603.78074: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204603.78149: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204603.78152: variable 'omit' from source: magic vars 42613 1727204603.78550: variable 'ansible_distribution_major_version' from source: facts 42613 1727204603.78573: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204603.78592: variable 'omit' from source: magic vars 42613 1727204603.78663: variable 'omit' from source: magic vars 42613 1727204603.78721: variable 'omit' from source: magic vars 42613 1727204603.78778: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 42613 1727204603.78872: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 42613 1727204603.78876: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 42613 1727204603.78889: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204603.78912: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204603.78953: variable 'inventory_hostname' from source: host vars for 'managed-node3' 42613 1727204603.78962: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204603.78971: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed-node3' 42613 1727204603.79100: Set connection var ansible_shell_executable to /bin/sh 42613 1727204603.79127: Set connection var ansible_pipelining to False 42613 1727204603.79130: Set connection var ansible_module_compression to ZIP_DEFLATED 42613 1727204603.79135: Set connection var ansible_connection to ssh 42613 1727204603.79271: Set connection var ansible_timeout to 10 42613 1727204603.79275: Set connection var ansible_shell_type to sh 42613 1727204603.79277: variable 'ansible_shell_executable' from source: unknown 42613 1727204603.79279: variable 'ansible_connection' from source: unknown 42613 1727204603.79282: variable 'ansible_module_compression' from source: unknown 42613 1727204603.79284: variable 'ansible_shell_type' from source: unknown 42613 1727204603.79286: variable 'ansible_shell_executable' from source: unknown 42613 1727204603.79288: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204603.79290: variable 'ansible_pipelining' from source: unknown 42613 1727204603.79292: variable 'ansible_timeout' from source: unknown 42613 1727204603.79294: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204603.79468: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 42613 1727204603.79487: variable 'omit' from source: magic vars 42613 1727204603.79496: starting attempt loop 42613 1727204603.79502: running the handler 42613 1727204603.79528: _low_level_execute_command(): starting 42613 1727204603.79543: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 42613 1727204603.80511: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204603.80542: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204603.80654: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204603.82491: stdout chunk (state=3): >>>/root <<< 42613 1727204603.82755: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204603.82759: stdout chunk (state=3): >>><<< 42613 1727204603.82762: stderr chunk (state=3): >>><<< 42613 1727204603.82789: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204603.82816: _low_level_execute_command(): starting 42613 1727204603.82945: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204603.8279858-44786-88387230243924 `" && echo ansible-tmp-1727204603.8279858-44786-88387230243924="` echo /root/.ansible/tmp/ansible-tmp-1727204603.8279858-44786-88387230243924 `" ) && sleep 0' 42613 1727204603.83617: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204603.83654: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204603.83773: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204603.83809: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204603.83931: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204603.86152: stdout chunk (state=3): >>>ansible-tmp-1727204603.8279858-44786-88387230243924=/root/.ansible/tmp/ansible-tmp-1727204603.8279858-44786-88387230243924 <<< 42613 1727204603.86361: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204603.86367: stdout chunk (state=3): >>><<< 42613 1727204603.86371: stderr chunk (state=3): >>><<< 42613 1727204603.86388: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204603.8279858-44786-88387230243924=/root/.ansible/tmp/ansible-tmp-1727204603.8279858-44786-88387230243924 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204603.86449: variable 'ansible_module_compression' from source: unknown 42613 1727204603.86573: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-42613igbktgy6/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 42613 1727204603.86589: variable 'ansible_facts' from source: unknown 42613 1727204603.86783: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204603.8279858-44786-88387230243924/AnsiballZ_package_facts.py 42613 1727204603.86995: Sending initial data 42613 1727204603.86999: Sent initial data (161 bytes) 42613 1727204603.87747: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204603.87789: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 
1727204603.87891: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204603.89724: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 42613 1727204603.89810: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 42613 1727204603.89898: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-42613igbktgy6/tmplkojvf_a /root/.ansible/tmp/ansible-tmp-1727204603.8279858-44786-88387230243924/AnsiballZ_package_facts.py <<< 42613 1727204603.89903: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204603.8279858-44786-88387230243924/AnsiballZ_package_facts.py" <<< 42613 1727204603.89969: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-42613igbktgy6/tmplkojvf_a" to remote "/root/.ansible/tmp/ansible-tmp-1727204603.8279858-44786-88387230243924/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204603.8279858-44786-88387230243924/AnsiballZ_package_facts.py" <<< 42613 1727204603.92098: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204603.92139: stderr chunk (state=3): >>><<< 42613 1727204603.92218: stdout chunk (state=3): >>><<< 42613 1727204603.92222: done transferring module to remote 42613 1727204603.92224: _low_level_execute_command(): starting 42613 1727204603.92227: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204603.8279858-44786-88387230243924/ /root/.ansible/tmp/ansible-tmp-1727204603.8279858-44786-88387230243924/AnsiballZ_package_facts.py && sleep 0' 42613 1727204603.93004: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204603.93077: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204603.93103: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204603.93128: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204603.93231: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204603.95271: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204603.95520: stderr chunk (state=3): >>><<< 42613 1727204603.95525: stdout chunk (state=3): >>><<< 42613 1727204603.95528: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204603.95536: _low_level_execute_command(): starting 42613 1727204603.95539: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204603.8279858-44786-88387230243924/AnsiballZ_package_facts.py && sleep 0' 42613 1727204603.96341: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204603.96389: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204603.96415: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204603.96440: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204603.96660: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
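The stdout chunks that follow are the `package_facts` module's single JSON result: an `ansible_facts.packages` mapping from package name to a list of entries with `name`, `version`, `release`, `epoch`, `arch`, and `source` fields. A minimal sketch of consuming that shape outside Ansible, using a hypothetical two-package sample (the variable names and the NEVRA-style formatting are illustrative, not part of the module's API):

```python
import json

# Hypothetical miniature of the ansible_facts JSON emitted below
# (same field layout as the real module output in this log).
module_stdout = '''
{"ansible_facts": {"packages": {
  "bash":  [{"name": "bash",  "version": "5.2.26", "release": "3.fc40",
             "epoch": null, "arch": "x86_64", "source": "rpm"}],
  "glibc": [{"name": "glibc", "version": "2.39",   "release": "22.fc40",
             "epoch": null, "arch": "x86_64", "source": "rpm"}]
}}}
'''

facts = json.loads(module_stdout)
packages = facts["ansible_facts"]["packages"]

def nevra(pkg):
    # Render one entry as "epoch:version-release.arch"; a null epoch
    # (JSON null -> Python None) is simply omitted, as rpm tools do.
    epoch = f'{pkg["epoch"]}:' if pkg.get("epoch") else ""
    return f'{epoch}{pkg["version"]}-{pkg["release"]}.{pkg["arch"]}'

# Each name maps to a LIST of entries (e.g. multilib installs can
# produce several); this sketch just takes the first entry per name.
summary = {name: nevra(entries[0]) for name, entries in packages.items()}
print(summary["bash"])  # -> 5.2.26-3.fc40.x86_64
```

Note the list-valued entries: code that assumes one entry per package name will silently drop duplicates when both `i686` and `x86_64` builds are installed.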
42613 1727204604.60912: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", 
"release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, <<< 42613 1727204604.61133: stdout chunk (state=3): >>>"arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": 
"glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "29.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", 
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": 
"grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", 
"version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.<<< 42613 1727204604.61149: stdout chunk (state=3): >>>fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": 
[{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", 
"release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": 
[{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": 
"1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": 
"8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": 
"rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": 
"fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": 
"dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": 
"rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": 
"503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", 
"source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": 
"1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": 
"strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": 
"libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": "wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": 
"2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", 
"version": "1.20.7", "release": "46.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "nmstate": [{"name": "nmstate", "version": "2.2.34", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-config-server": [{"name": "NetworkManager-config-server", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "nmstate-libs": [{"name": "nmstate-libs", "version": "2.2.34", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libnmstate": 
[{"name": "python3-libnmstate", "version": "2.2.34", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 42613 1727204604.63345: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. <<< 42613 1727204604.63349: stdout chunk (state=3): >>><<< 42613 1727204604.63352: stderr chunk (state=3): >>><<< 42613 1727204604.63583: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": 
"1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": 
"sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "29.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", 
"release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", 
"release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": 
"2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": 
"4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", 
"version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": 
[{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": 
[{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", 
"release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": 
"1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": 
[{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": 
"rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", 
"release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 
0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", 
"release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], 
"perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": "wget2-wget", 
"version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "nmstate": [{"name": "nmstate", "version": "2.2.34", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-config-server": [{"name": 
"NetworkManager-config-server", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "nmstate-libs": [{"name": "nmstate-libs", "version": "2.2.34", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libnmstate": [{"name": "python3-libnmstate", "version": "2.2.34", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
42613 1727204604.66331: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204603.8279858-44786-88387230243924/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 42613 1727204604.66365: _low_level_execute_command(): starting 42613 1727204604.66379: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204603.8279858-44786-88387230243924/ > /dev/null 2>&1 && sleep 0' 42613 1727204604.67056: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204604.67079: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204604.67093: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204604.67209: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 
originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204604.67488: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204604.67572: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204604.69693: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204604.69918: stderr chunk (state=3): >>><<< 42613 1727204604.69930: stdout chunk (state=3): >>><<< 42613 1727204604.69956: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204604.69975: handler run complete 42613 1727204604.71775: variable 
'ansible_facts' from source: unknown 42613 1727204604.72964: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204604.76541: variable 'ansible_facts' from source: unknown 42613 1727204604.77373: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204604.78672: attempt loop complete, returning result 42613 1727204604.78691: _execute() done 42613 1727204604.78695: dumping result to json 42613 1727204604.79424: done dumping result, returning 42613 1727204604.79435: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [127b8e07-fff9-2f91-05d8-0000000004f8] 42613 1727204604.79443: sending task result for task 127b8e07-fff9-2f91-05d8-0000000004f8 42613 1727204604.85529: done sending task result for task 127b8e07-fff9-2f91-05d8-0000000004f8 42613 1727204604.85534: WORKER PROCESS EXITING ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 42613 1727204604.85661: no more pending results, returning what we have 42613 1727204604.85664: results queue empty 42613 1727204604.85668: checking for any_errors_fatal 42613 1727204604.85673: done checking for any_errors_fatal 42613 1727204604.85674: checking for max_fail_percentage 42613 1727204604.85676: done checking for max_fail_percentage 42613 1727204604.85676: checking to see if all hosts have failed and the running result is not ok 42613 1727204604.85677: done checking to see if all hosts have failed 42613 1727204604.85678: getting the remaining hosts for this loop 42613 1727204604.85679: done getting the remaining hosts for this loop 42613 1727204604.85683: getting the next task for host managed-node3 42613 1727204604.85690: done getting next task for host managed-node3 42613 1727204604.85694: ^ task is: TASK: 
fedora.linux_system_roles.network : Print network provider 42613 1727204604.85696: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 42613 1727204604.85706: getting variables 42613 1727204604.85707: in VariableManager get_vars() 42613 1727204604.85741: Calling all_inventory to load vars for managed-node3 42613 1727204604.85744: Calling groups_inventory to load vars for managed-node3 42613 1727204604.85747: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204604.85756: Calling all_plugins_play to load vars for managed-node3 42613 1727204604.85759: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204604.85762: Calling groups_plugins_play to load vars for managed-node3 42613 1727204604.88620: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204604.92307: done with get_vars() 42613 1727204604.92353: done getting variables 42613 1727204604.92417: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 15:03:24 -0400 (0:00:01.158) 0:00:33.532 ***** 42613 1727204604.92451: entering _queue_task() for managed-node3/debug 42613 1727204604.93239: worker is 1 (out of 1 available) 42613 1727204604.93256: exiting _queue_task() for managed-node3/debug 42613 
1727204604.93272: done queuing things up, now waiting for results queue to drain 42613 1727204604.93273: waiting for pending results... 42613 1727204604.93988: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider 42613 1727204604.94375: in run() - task 127b8e07-fff9-2f91-05d8-000000000072 42613 1727204604.94380: variable 'ansible_search_path' from source: unknown 42613 1727204604.94384: variable 'ansible_search_path' from source: unknown 42613 1727204604.94387: calling self._execute() 42613 1727204604.94773: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204604.94778: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204604.94781: variable 'omit' from source: magic vars 42613 1727204604.95442: variable 'ansible_distribution_major_version' from source: facts 42613 1727204604.95463: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204604.95477: variable 'omit' from source: magic vars 42613 1727204604.95529: variable 'omit' from source: magic vars 42613 1727204604.95896: variable 'network_provider' from source: set_fact 42613 1727204604.95922: variable 'omit' from source: magic vars 42613 1727204604.95978: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 42613 1727204604.96024: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 42613 1727204604.96471: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 42613 1727204604.96475: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204604.96477: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204604.96480: variable 'inventory_hostname' from source: 
host vars for 'managed-node3' 42613 1727204604.96482: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204604.96485: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204604.96504: Set connection var ansible_shell_executable to /bin/sh 42613 1727204604.96514: Set connection var ansible_pipelining to False 42613 1727204604.96972: Set connection var ansible_module_compression to ZIP_DEFLATED 42613 1727204604.96976: Set connection var ansible_connection to ssh 42613 1727204604.96978: Set connection var ansible_timeout to 10 42613 1727204604.96981: Set connection var ansible_shell_type to sh 42613 1727204604.96983: variable 'ansible_shell_executable' from source: unknown 42613 1727204604.96986: variable 'ansible_connection' from source: unknown 42613 1727204604.96988: variable 'ansible_module_compression' from source: unknown 42613 1727204604.96990: variable 'ansible_shell_type' from source: unknown 42613 1727204604.96992: variable 'ansible_shell_executable' from source: unknown 42613 1727204604.96994: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204604.96996: variable 'ansible_pipelining' from source: unknown 42613 1727204604.96998: variable 'ansible_timeout' from source: unknown 42613 1727204604.97000: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204604.97044: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 42613 1727204604.97068: variable 'omit' from source: magic vars 42613 1727204604.97472: starting attempt loop 42613 1727204604.97476: running the handler 42613 1727204604.97479: handler run complete 42613 1727204604.97482: attempt loop complete, returning result 42613 
1727204604.97485: _execute() done 42613 1727204604.97487: dumping result to json 42613 1727204604.97490: done dumping result, returning 42613 1727204604.97493: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider [127b8e07-fff9-2f91-05d8-000000000072] 42613 1727204604.97496: sending task result for task 127b8e07-fff9-2f91-05d8-000000000072 42613 1727204604.97573: done sending task result for task 127b8e07-fff9-2f91-05d8-000000000072 42613 1727204604.97576: WORKER PROCESS EXITING ok: [managed-node3] => {} MSG: Using network provider: nm 42613 1727204604.97646: no more pending results, returning what we have 42613 1727204604.97650: results queue empty 42613 1727204604.97651: checking for any_errors_fatal 42613 1727204604.97666: done checking for any_errors_fatal 42613 1727204604.97669: checking for max_fail_percentage 42613 1727204604.97671: done checking for max_fail_percentage 42613 1727204604.97672: checking to see if all hosts have failed and the running result is not ok 42613 1727204604.97673: done checking to see if all hosts have failed 42613 1727204604.97674: getting the remaining hosts for this loop 42613 1727204604.97676: done getting the remaining hosts for this loop 42613 1727204604.97681: getting the next task for host managed-node3 42613 1727204604.97688: done getting next task for host managed-node3 42613 1727204604.97692: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 42613 1727204604.97694: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204604.97705: getting variables 42613 1727204604.97707: in VariableManager get_vars() 42613 1727204604.97748: Calling all_inventory to load vars for managed-node3 42613 1727204604.97751: Calling groups_inventory to load vars for managed-node3 42613 1727204604.97754: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204604.97870: Calling all_plugins_play to load vars for managed-node3 42613 1727204604.97875: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204604.97879: Calling groups_plugins_play to load vars for managed-node3 42613 1727204605.02015: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204605.07506: done with get_vars() 42613 1727204605.07552: done getting variables 42613 1727204605.07727: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 15:03:25 -0400 (0:00:00.153) 0:00:33.685 ***** 42613 1727204605.07764: entering _queue_task() for managed-node3/fail 42613 1727204605.09006: worker is 1 (out of 1 available) 42613 1727204605.09019: exiting _queue_task() for managed-node3/fail 42613 1727204605.09033: done queuing things up, now waiting for results queue to drain 42613 1727204605.09034: waiting for pending results... 
42613 1727204605.09420: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 42613 1727204605.09636: in run() - task 127b8e07-fff9-2f91-05d8-000000000073 42613 1727204605.09782: variable 'ansible_search_path' from source: unknown 42613 1727204605.09792: variable 'ansible_search_path' from source: unknown 42613 1727204605.09836: calling self._execute() 42613 1727204605.10040: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204605.10155: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204605.10176: variable 'omit' from source: magic vars 42613 1727204605.11233: variable 'ansible_distribution_major_version' from source: facts 42613 1727204605.11258: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204605.11800: variable 'network_state' from source: role '' defaults 42613 1727204605.11871: Evaluated conditional (network_state != {}): False 42613 1727204605.11881: when evaluation is False, skipping this task 42613 1727204605.12078: _execute() done 42613 1727204605.12082: dumping result to json 42613 1727204605.12085: done dumping result, returning 42613 1727204605.12088: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [127b8e07-fff9-2f91-05d8-000000000073] 42613 1727204605.12091: sending task result for task 127b8e07-fff9-2f91-05d8-000000000073 skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 42613 1727204605.12250: no more pending results, returning what we have 42613 1727204605.12254: results queue empty 42613 1727204605.12256: checking for any_errors_fatal 42613 1727204605.12264: done 
checking for any_errors_fatal 42613 1727204605.12267: checking for max_fail_percentage 42613 1727204605.12269: done checking for max_fail_percentage 42613 1727204605.12270: checking to see if all hosts have failed and the running result is not ok 42613 1727204605.12271: done checking to see if all hosts have failed 42613 1727204605.12271: getting the remaining hosts for this loop 42613 1727204605.12273: done getting the remaining hosts for this loop 42613 1727204605.12277: getting the next task for host managed-node3 42613 1727204605.12284: done getting next task for host managed-node3 42613 1727204605.12288: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 42613 1727204605.12291: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204605.12308: getting variables 42613 1727204605.12310: in VariableManager get_vars() 42613 1727204605.12352: Calling all_inventory to load vars for managed-node3 42613 1727204605.12355: Calling groups_inventory to load vars for managed-node3 42613 1727204605.12357: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204605.12775: Calling all_plugins_play to load vars for managed-node3 42613 1727204605.12780: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204605.12786: Calling groups_plugins_play to load vars for managed-node3 42613 1727204605.13577: done sending task result for task 127b8e07-fff9-2f91-05d8-000000000073 42613 1727204605.13581: WORKER PROCESS EXITING 42613 1727204605.24612: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204605.27192: done with get_vars() 42613 1727204605.27289: done getting variables 42613 1727204605.27359: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 15:03:25 -0400 (0:00:00.196) 0:00:33.882 ***** 42613 1727204605.27392: entering _queue_task() for managed-node3/fail 42613 1727204605.28043: worker is 1 (out of 1 available) 42613 1727204605.28060: exiting _queue_task() for managed-node3/fail 42613 1727204605.28078: done queuing things up, now waiting for results queue to drain 42613 1727204605.28080: waiting for pending results... 
42613 1727204605.28347: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 42613 1727204605.28505: in run() - task 127b8e07-fff9-2f91-05d8-000000000074 42613 1727204605.28519: variable 'ansible_search_path' from source: unknown 42613 1727204605.28522: variable 'ansible_search_path' from source: unknown 42613 1727204605.28570: calling self._execute() 42613 1727204605.28699: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204605.28715: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204605.28726: variable 'omit' from source: magic vars 42613 1727204605.29196: variable 'ansible_distribution_major_version' from source: facts 42613 1727204605.29209: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204605.29382: variable 'network_state' from source: role '' defaults 42613 1727204605.29393: Evaluated conditional (network_state != {}): False 42613 1727204605.29397: when evaluation is False, skipping this task 42613 1727204605.29400: _execute() done 42613 1727204605.29403: dumping result to json 42613 1727204605.29406: done dumping result, returning 42613 1727204605.29419: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [127b8e07-fff9-2f91-05d8-000000000074] 42613 1727204605.29426: sending task result for task 127b8e07-fff9-2f91-05d8-000000000074 42613 1727204605.29548: done sending task result for task 127b8e07-fff9-2f91-05d8-000000000074 42613 1727204605.29552: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 42613 1727204605.29621: no more pending results, returning what we have 42613 
1727204605.29627: results queue empty 42613 1727204605.29628: checking for any_errors_fatal 42613 1727204605.29643: done checking for any_errors_fatal 42613 1727204605.29645: checking for max_fail_percentage 42613 1727204605.29648: done checking for max_fail_percentage 42613 1727204605.29650: checking to see if all hosts have failed and the running result is not ok 42613 1727204605.29651: done checking to see if all hosts have failed 42613 1727204605.29652: getting the remaining hosts for this loop 42613 1727204605.29654: done getting the remaining hosts for this loop 42613 1727204605.29659: getting the next task for host managed-node3 42613 1727204605.29669: done getting next task for host managed-node3 42613 1727204605.29674: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 42613 1727204605.29676: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204605.29697: getting variables 42613 1727204605.29699: in VariableManager get_vars() 42613 1727204605.29750: Calling all_inventory to load vars for managed-node3 42613 1727204605.29754: Calling groups_inventory to load vars for managed-node3 42613 1727204605.29756: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204605.29890: Calling all_plugins_play to load vars for managed-node3 42613 1727204605.29895: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204605.29900: Calling groups_plugins_play to load vars for managed-node3 42613 1727204605.31923: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204605.34529: done with get_vars() 42613 1727204605.34578: done getting variables 42613 1727204605.34647: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 15:03:25 -0400 (0:00:00.072) 0:00:33.955 ***** 42613 1727204605.34689: entering _queue_task() for managed-node3/fail 42613 1727204605.35099: worker is 1 (out of 1 available) 42613 1727204605.35227: exiting _queue_task() for managed-node3/fail 42613 1727204605.35242: done queuing things up, now waiting for results queue to drain 42613 1727204605.35243: waiting for pending results... 
42613 1727204605.35688: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 42613 1727204605.35699: in run() - task 127b8e07-fff9-2f91-05d8-000000000075 42613 1727204605.35703: variable 'ansible_search_path' from source: unknown 42613 1727204605.35706: variable 'ansible_search_path' from source: unknown 42613 1727204605.35709: calling self._execute() 42613 1727204605.35787: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204605.35791: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204605.35802: variable 'omit' from source: magic vars 42613 1727204605.36243: variable 'ansible_distribution_major_version' from source: facts 42613 1727204605.36258: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204605.36477: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 42613 1727204605.39531: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 42613 1727204605.39614: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 42613 1727204605.39671: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 42613 1727204605.39716: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 42613 1727204605.39826: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 42613 1727204605.39840: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204605.39876: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204605.39902: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204605.39955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204605.39971: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204605.40095: variable 'ansible_distribution_major_version' from source: facts 42613 1727204605.40114: Evaluated conditional (ansible_distribution_major_version | int > 9): True 42613 1727204605.40269: variable 'ansible_distribution' from source: facts 42613 1727204605.40272: variable '__network_rh_distros' from source: role '' defaults 42613 1727204605.40283: Evaluated conditional (ansible_distribution in __network_rh_distros): False 42613 1727204605.40287: when evaluation is False, skipping this task 42613 1727204605.40290: _execute() done 42613 1727204605.40293: dumping result to json 42613 1727204605.40297: done dumping result, returning 42613 1727204605.40308: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [127b8e07-fff9-2f91-05d8-000000000075] 42613 1727204605.40369: sending task result for task 127b8e07-fff9-2f91-05d8-000000000075 42613 1727204605.40447: done sending task result for task 127b8e07-fff9-2f91-05d8-000000000075 42613 1727204605.40450: WORKER PROCESS EXITING 
skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 42613 1727204605.40510: no more pending results, returning what we have 42613 1727204605.40513: results queue empty 42613 1727204605.40515: checking for any_errors_fatal 42613 1727204605.40521: done checking for any_errors_fatal 42613 1727204605.40522: checking for max_fail_percentage 42613 1727204605.40524: done checking for max_fail_percentage 42613 1727204605.40525: checking to see if all hosts have failed and the running result is not ok 42613 1727204605.40526: done checking to see if all hosts have failed 42613 1727204605.40526: getting the remaining hosts for this loop 42613 1727204605.40528: done getting the remaining hosts for this loop 42613 1727204605.40533: getting the next task for host managed-node3 42613 1727204605.40543: done getting next task for host managed-node3 42613 1727204605.40548: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 42613 1727204605.40550: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204605.40672: getting variables 42613 1727204605.40674: in VariableManager get_vars() 42613 1727204605.40717: Calling all_inventory to load vars for managed-node3 42613 1727204605.40720: Calling groups_inventory to load vars for managed-node3 42613 1727204605.40722: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204605.40734: Calling all_plugins_play to load vars for managed-node3 42613 1727204605.40736: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204605.40742: Calling groups_plugins_play to load vars for managed-node3 42613 1727204605.42944: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204605.45302: done with get_vars() 42613 1727204605.45348: done getting variables 42613 1727204605.45416: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 15:03:25 -0400 (0:00:00.107) 0:00:34.062 ***** 42613 1727204605.45457: entering _queue_task() for managed-node3/dnf 42613 1727204605.45986: worker is 1 (out of 1 available) 42613 1727204605.46001: exiting _queue_task() for managed-node3/dnf 42613 1727204605.46013: done queuing things up, now waiting for results queue to drain 42613 1727204605.46014: waiting for pending results... 
42613 1727204605.46487: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 42613 1727204605.46492: in run() - task 127b8e07-fff9-2f91-05d8-000000000076 42613 1727204605.46495: variable 'ansible_search_path' from source: unknown 42613 1727204605.46498: variable 'ansible_search_path' from source: unknown 42613 1727204605.46501: calling self._execute() 42613 1727204605.46556: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204605.46564: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204605.46579: variable 'omit' from source: magic vars 42613 1727204605.47078: variable 'ansible_distribution_major_version' from source: facts 42613 1727204605.47178: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204605.47342: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 42613 1727204605.50980: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 42613 1727204605.51191: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 42613 1727204605.51233: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 42613 1727204605.51322: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 42613 1727204605.51350: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 42613 1727204605.51462: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204605.51502: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204605.51530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204605.51581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204605.51603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204605.51977: variable 'ansible_distribution' from source: facts 42613 1727204605.51981: variable 'ansible_distribution_major_version' from source: facts 42613 1727204605.51984: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 42613 1727204605.51987: variable '__network_wireless_connections_defined' from source: role '' defaults 42613 1727204605.52086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204605.52249: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204605.52282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204605.52324: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204605.52332: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204605.52664: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204605.52671: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204605.52673: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204605.52676: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204605.52678: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204605.52680: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204605.52682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 
1727204605.52897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204605.52939: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204605.52956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204605.53410: variable 'network_connections' from source: play vars 42613 1727204605.53422: variable 'profile' from source: play vars 42613 1727204605.53637: variable 'profile' from source: play vars 42613 1727204605.53640: variable 'interface' from source: set_fact 42613 1727204605.53836: variable 'interface' from source: set_fact 42613 1727204605.54039: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 42613 1727204605.54491: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 42613 1727204605.54598: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 42613 1727204605.54748: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 42613 1727204605.54781: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 42613 1727204605.54834: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 42613 1727204605.54982: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 42613 1727204605.55070: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204605.55073: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 42613 1727204605.55213: variable '__network_team_connections_defined' from source: role '' defaults 42613 1727204605.55857: variable 'network_connections' from source: play vars 42613 1727204605.55862: variable 'profile' from source: play vars 42613 1727204605.56050: variable 'profile' from source: play vars 42613 1727204605.56054: variable 'interface' from source: set_fact 42613 1727204605.56124: variable 'interface' from source: set_fact 42613 1727204605.56270: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 42613 1727204605.56274: when evaluation is False, skipping this task 42613 1727204605.56277: _execute() done 42613 1727204605.56279: dumping result to json 42613 1727204605.56471: done dumping result, returning 42613 1727204605.56475: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [127b8e07-fff9-2f91-05d8-000000000076] 42613 1727204605.56477: sending task result for task 127b8e07-fff9-2f91-05d8-000000000076 42613 1727204605.56556: done sending task result for task 127b8e07-fff9-2f91-05d8-000000000076 42613 1727204605.56558: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": 
"Conditional result was False" } 42613 1727204605.56652: no more pending results, returning what we have 42613 1727204605.56656: results queue empty 42613 1727204605.56657: checking for any_errors_fatal 42613 1727204605.56668: done checking for any_errors_fatal 42613 1727204605.56669: checking for max_fail_percentage 42613 1727204605.56671: done checking for max_fail_percentage 42613 1727204605.56673: checking to see if all hosts have failed and the running result is not ok 42613 1727204605.56674: done checking to see if all hosts have failed 42613 1727204605.56675: getting the remaining hosts for this loop 42613 1727204605.56676: done getting the remaining hosts for this loop 42613 1727204605.56681: getting the next task for host managed-node3 42613 1727204605.56689: done getting next task for host managed-node3 42613 1727204605.56694: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 42613 1727204605.56696: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204605.56714: getting variables 42613 1727204605.56716: in VariableManager get_vars() 42613 1727204605.56764: Calling all_inventory to load vars for managed-node3 42613 1727204605.56973: Calling groups_inventory to load vars for managed-node3 42613 1727204605.56977: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204605.56991: Calling all_plugins_play to load vars for managed-node3 42613 1727204605.56994: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204605.56998: Calling groups_plugins_play to load vars for managed-node3 42613 1727204605.61224: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204605.64637: done with get_vars() 42613 1727204605.64681: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 42613 1727204605.64779: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 15:03:25 -0400 (0:00:00.193) 0:00:34.256 ***** 42613 1727204605.64819: entering _queue_task() for managed-node3/yum 42613 1727204605.65381: worker is 1 (out of 1 available) 42613 1727204605.65395: exiting _queue_task() for managed-node3/yum 42613 1727204605.65408: done queuing things up, now waiting for results queue to drain 42613 1727204605.65410: waiting for pending results... 
42613 1727204605.65718: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 42613 1727204605.65756: in run() - task 127b8e07-fff9-2f91-05d8-000000000077 42613 1727204605.65774: variable 'ansible_search_path' from source: unknown 42613 1727204605.65778: variable 'ansible_search_path' from source: unknown 42613 1727204605.65821: calling self._execute() 42613 1727204605.66272: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204605.66277: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204605.66280: variable 'omit' from source: magic vars 42613 1727204605.66393: variable 'ansible_distribution_major_version' from source: facts 42613 1727204605.66413: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204605.66617: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 42613 1727204605.69411: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 42613 1727204605.69505: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 42613 1727204605.69547: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 42613 1727204605.69586: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 42613 1727204605.69689: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 42613 1727204605.70080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204605.70084: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204605.70088: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204605.70091: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204605.70093: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204605.70227: variable 'ansible_distribution_major_version' from source: facts 42613 1727204605.70247: Evaluated conditional (ansible_distribution_major_version | int < 8): False 42613 1727204605.70251: when evaluation is False, skipping this task 42613 1727204605.70254: _execute() done 42613 1727204605.70256: dumping result to json 42613 1727204605.70260: done dumping result, returning 42613 1727204605.70273: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [127b8e07-fff9-2f91-05d8-000000000077] 42613 1727204605.70278: sending task result for task 127b8e07-fff9-2f91-05d8-000000000077 42613 1727204605.70651: done sending task result for task 127b8e07-fff9-2f91-05d8-000000000077 42613 1727204605.70654: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 42613 1727204605.70723: no more pending results, returning 
what we have 42613 1727204605.70727: results queue empty 42613 1727204605.70728: checking for any_errors_fatal 42613 1727204605.70737: done checking for any_errors_fatal 42613 1727204605.70741: checking for max_fail_percentage 42613 1727204605.70744: done checking for max_fail_percentage 42613 1727204605.70745: checking to see if all hosts have failed and the running result is not ok 42613 1727204605.70746: done checking to see if all hosts have failed 42613 1727204605.70747: getting the remaining hosts for this loop 42613 1727204605.70748: done getting the remaining hosts for this loop 42613 1727204605.70753: getting the next task for host managed-node3 42613 1727204605.70761: done getting next task for host managed-node3 42613 1727204605.70767: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 42613 1727204605.70770: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204605.70787: getting variables 42613 1727204605.70789: in VariableManager get_vars() 42613 1727204605.70834: Calling all_inventory to load vars for managed-node3 42613 1727204605.70837: Calling groups_inventory to load vars for managed-node3 42613 1727204605.70841: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204605.70855: Calling all_plugins_play to load vars for managed-node3 42613 1727204605.70858: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204605.70861: Calling groups_plugins_play to load vars for managed-node3 42613 1727204605.74829: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204605.77697: done with get_vars() 42613 1727204605.77737: done getting variables 42613 1727204605.77811: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 15:03:25 -0400 (0:00:00.130) 0:00:34.386 ***** 42613 1727204605.77845: entering _queue_task() for managed-node3/fail 42613 1727204605.78408: worker is 1 (out of 1 available) 42613 1727204605.78423: exiting _queue_task() for managed-node3/fail 42613 1727204605.78436: done queuing things up, now waiting for results queue to drain 42613 1727204605.78438: waiting for pending results... 
42613 1727204605.78685: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 42613 1727204605.78785: in run() - task 127b8e07-fff9-2f91-05d8-000000000078 42613 1727204605.78912: variable 'ansible_search_path' from source: unknown 42613 1727204605.78921: variable 'ansible_search_path' from source: unknown 42613 1727204605.78980: calling self._execute() 42613 1727204605.79101: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204605.79115: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204605.79141: variable 'omit' from source: magic vars 42613 1727204605.79684: variable 'ansible_distribution_major_version' from source: facts 42613 1727204605.79689: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204605.79846: variable '__network_wireless_connections_defined' from source: role '' defaults 42613 1727204605.80136: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 42613 1727204605.83473: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 42613 1727204605.83526: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 42613 1727204605.83616: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 42613 1727204605.83681: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 42613 1727204605.83716: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 42613 1727204605.83905: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 42613 1727204605.83909: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204605.83911: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204605.83943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204605.83969: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204605.84034: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204605.84068: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204605.84100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204605.84158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204605.84179: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204605.84236: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204605.84271: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204605.84304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204605.84360: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204605.84380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204605.84594: variable 'network_connections' from source: play vars 42613 1727204605.84667: variable 'profile' from source: play vars 42613 1727204605.84710: variable 'profile' from source: play vars 42613 1727204605.84721: variable 'interface' from source: set_fact 42613 1727204605.84808: variable 'interface' from source: set_fact 42613 1727204605.84912: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 42613 1727204605.85143: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 42613 1727204605.85196: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 42613 1727204605.85271: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 42613 1727204605.85348: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 42613 1727204605.85398: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 42613 1727204605.85527: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 42613 1727204605.85541: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204605.85626: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 42613 1727204605.85682: variable '__network_team_connections_defined' from source: role '' defaults 42613 1727204605.85992: variable 'network_connections' from source: play vars 42613 1727204605.86071: variable 'profile' from source: play vars 42613 1727204605.86085: variable 'profile' from source: play vars 42613 1727204605.86097: variable 'interface' from source: set_fact 42613 1727204605.86167: variable 'interface' from source: set_fact 42613 1727204605.86214: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 42613 1727204605.86222: when evaluation is False, skipping this task 42613 1727204605.86228: _execute() done 42613 1727204605.86242: dumping result to json 42613 1727204605.86251: done dumping result, returning 42613 1727204605.86262: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's 
consent to restart NetworkManager due to wireless or team interfaces [127b8e07-fff9-2f91-05d8-000000000078] 42613 1727204605.86285: sending task result for task 127b8e07-fff9-2f91-05d8-000000000078 skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 42613 1727204605.86632: no more pending results, returning what we have 42613 1727204605.86637: results queue empty 42613 1727204605.86643: checking for any_errors_fatal 42613 1727204605.86654: done checking for any_errors_fatal 42613 1727204605.86655: checking for max_fail_percentage 42613 1727204605.86657: done checking for max_fail_percentage 42613 1727204605.86659: checking to see if all hosts have failed and the running result is not ok 42613 1727204605.86660: done checking to see if all hosts have failed 42613 1727204605.86661: getting the remaining hosts for this loop 42613 1727204605.86662: done getting the remaining hosts for this loop 42613 1727204605.86670: getting the next task for host managed-node3 42613 1727204605.86678: done getting next task for host managed-node3 42613 1727204605.86683: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 42613 1727204605.86685: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204605.86702: getting variables 42613 1727204605.86704: in VariableManager get_vars() 42613 1727204605.86753: Calling all_inventory to load vars for managed-node3 42613 1727204605.86756: Calling groups_inventory to load vars for managed-node3 42613 1727204605.86759: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204605.86893: Calling all_plugins_play to load vars for managed-node3 42613 1727204605.86898: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204605.87000: Calling groups_plugins_play to load vars for managed-node3 42613 1727204605.87661: done sending task result for task 127b8e07-fff9-2f91-05d8-000000000078 42613 1727204605.87667: WORKER PROCESS EXITING 42613 1727204605.90068: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204605.93305: done with get_vars() 42613 1727204605.93346: done getting variables 42613 1727204605.93414: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 15:03:25 -0400 (0:00:00.156) 0:00:34.542 ***** 42613 1727204605.93453: entering _queue_task() for managed-node3/package 42613 1727204605.93845: worker is 1 (out of 1 available) 42613 1727204605.93859: exiting _queue_task() for managed-node3/package 42613 1727204605.93875: done queuing things up, now waiting for results queue to drain 42613 1727204605.93877: waiting for pending results... 
42613 1727204605.94188: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages 42613 1727204605.94313: in run() - task 127b8e07-fff9-2f91-05d8-000000000079 42613 1727204605.94333: variable 'ansible_search_path' from source: unknown 42613 1727204605.94340: variable 'ansible_search_path' from source: unknown 42613 1727204605.94386: calling self._execute() 42613 1727204605.94522: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204605.94540: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204605.94571: variable 'omit' from source: magic vars 42613 1727204605.94998: variable 'ansible_distribution_major_version' from source: facts 42613 1727204605.95069: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204605.95256: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 42613 1727204605.95577: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 42613 1727204605.95640: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 42613 1727204605.95684: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 42613 1727204605.95784: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 42613 1727204605.95920: variable 'network_packages' from source: role '' defaults 42613 1727204605.96056: variable '__network_provider_setup' from source: role '' defaults 42613 1727204605.96163: variable '__network_service_name_default_nm' from source: role '' defaults 42613 1727204605.96166: variable '__network_service_name_default_nm' from source: role '' defaults 42613 1727204605.96171: variable '__network_packages_default_nm' from source: role '' defaults 42613 1727204605.96240: variable 
'__network_packages_default_nm' from source: role '' defaults 42613 1727204605.96461: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 42613 1727204605.98839: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 42613 1727204605.98926: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 42613 1727204605.98975: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 42613 1727204605.99022: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 42613 1727204605.99070: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 42613 1727204605.99271: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204605.99275: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204605.99278: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204605.99307: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204605.99328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 
1727204605.99397: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204605.99434: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204605.99469: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204605.99529: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204605.99615: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204605.99802: variable '__network_packages_default_gobject_packages' from source: role '' defaults 42613 1727204605.99936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204606.00054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204606.00057: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204606.00060: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204606.00062: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204606.00164: variable 'ansible_python' from source: facts 42613 1727204606.00199: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 42613 1727204606.00301: variable '__network_wpa_supplicant_required' from source: role '' defaults 42613 1727204606.00402: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 42613 1727204606.00554: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204606.00587: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204606.00625: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204606.00670: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204606.00689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204606.00751: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204606.00817: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204606.00821: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204606.00869: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204606.00888: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204606.01271: variable 'network_connections' from source: play vars 42613 1727204606.01274: variable 'profile' from source: play vars 42613 1727204606.01276: variable 'profile' from source: play vars 42613 1727204606.01278: variable 'interface' from source: set_fact 42613 1727204606.01280: variable 'interface' from source: set_fact 42613 1727204606.01333: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 42613 1727204606.01363: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 42613 1727204606.01404: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204606.01438: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 42613 1727204606.01494: variable '__network_wireless_connections_defined' from source: role '' defaults 42613 1727204606.01840: variable 'network_connections' from source: play vars 42613 1727204606.01851: variable 'profile' from source: play vars 42613 1727204606.01968: variable 'profile' from source: play vars 42613 1727204606.01981: variable 'interface' from source: set_fact 42613 1727204606.02062: variable 'interface' from source: set_fact 42613 1727204606.02105: variable '__network_packages_default_wireless' from source: role '' defaults 42613 1727204606.02201: variable '__network_wireless_connections_defined' from source: role '' defaults 42613 1727204606.02579: variable 'network_connections' from source: play vars 42613 1727204606.02595: variable 'profile' from source: play vars 42613 1727204606.02675: variable 'profile' from source: play vars 42613 1727204606.02684: variable 'interface' from source: set_fact 42613 1727204606.02795: variable 'interface' from source: set_fact 42613 1727204606.02841: variable '__network_packages_default_team' from source: role '' defaults 42613 1727204606.02944: variable '__network_team_connections_defined' from source: role '' defaults 42613 1727204606.03354: variable 'network_connections' from source: play vars 42613 1727204606.03359: variable 'profile' from source: play vars 42613 1727204606.03361: variable 'profile' from source: play vars 42613 1727204606.03374: variable 'interface' from source: set_fact 42613 1727204606.03572: variable 'interface' from source: set_fact 42613 1727204606.03577: variable '__network_service_name_default_initscripts' from source: role '' defaults 42613 1727204606.03635: 
variable '__network_service_name_default_initscripts' from source: role '' defaults 42613 1727204606.03648: variable '__network_packages_default_initscripts' from source: role '' defaults 42613 1727204606.03731: variable '__network_packages_default_initscripts' from source: role '' defaults 42613 1727204606.03986: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 42613 1727204606.04780: variable 'network_connections' from source: play vars 42613 1727204606.04788: variable 'profile' from source: play vars 42613 1727204606.04891: variable 'profile' from source: play vars 42613 1727204606.04896: variable 'interface' from source: set_fact 42613 1727204606.04940: variable 'interface' from source: set_fact 42613 1727204606.04953: variable 'ansible_distribution' from source: facts 42613 1727204606.04960: variable '__network_rh_distros' from source: role '' defaults 42613 1727204606.04973: variable 'ansible_distribution_major_version' from source: facts 42613 1727204606.04989: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 42613 1727204606.05217: variable 'ansible_distribution' from source: facts 42613 1727204606.05223: variable '__network_rh_distros' from source: role '' defaults 42613 1727204606.05228: variable 'ansible_distribution_major_version' from source: facts 42613 1727204606.05230: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 42613 1727204606.05371: variable 'ansible_distribution' from source: facts 42613 1727204606.05380: variable '__network_rh_distros' from source: role '' defaults 42613 1727204606.05389: variable 'ansible_distribution_major_version' from source: facts 42613 1727204606.05427: variable 'network_provider' from source: set_fact 42613 1727204606.05458: variable 'ansible_facts' from source: unknown 42613 1727204606.06490: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 42613 
1727204606.06527: when evaluation is False, skipping this task 42613 1727204606.06533: _execute() done 42613 1727204606.06538: dumping result to json 42613 1727204606.06540: done dumping result, returning 42613 1727204606.06543: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages [127b8e07-fff9-2f91-05d8-000000000079] 42613 1727204606.06572: sending task result for task 127b8e07-fff9-2f91-05d8-000000000079 42613 1727204606.06816: done sending task result for task 127b8e07-fff9-2f91-05d8-000000000079 42613 1727204606.06821: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 42613 1727204606.06886: no more pending results, returning what we have 42613 1727204606.06890: results queue empty 42613 1727204606.06891: checking for any_errors_fatal 42613 1727204606.06901: done checking for any_errors_fatal 42613 1727204606.06902: checking for max_fail_percentage 42613 1727204606.06904: done checking for max_fail_percentage 42613 1727204606.06905: checking to see if all hosts have failed and the running result is not ok 42613 1727204606.06906: done checking to see if all hosts have failed 42613 1727204606.06907: getting the remaining hosts for this loop 42613 1727204606.06908: done getting the remaining hosts for this loop 42613 1727204606.06913: getting the next task for host managed-node3 42613 1727204606.06920: done getting next task for host managed-node3 42613 1727204606.06924: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 42613 1727204606.06926: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 42613 1727204606.06943: getting variables 42613 1727204606.06945: in VariableManager get_vars() 42613 1727204606.07123: Calling all_inventory to load vars for managed-node3 42613 1727204606.07127: Calling groups_inventory to load vars for managed-node3 42613 1727204606.07129: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204606.07146: Calling all_plugins_play to load vars for managed-node3 42613 1727204606.07149: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204606.07152: Calling groups_plugins_play to load vars for managed-node3 42613 1727204606.09212: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204606.11976: done with get_vars() 42613 1727204606.12132: done getting variables 42613 1727204606.12196: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 15:03:26 -0400 (0:00:00.188) 0:00:34.730 ***** 42613 1727204606.12279: entering _queue_task() for managed-node3/package 42613 1727204606.13110: worker is 1 (out of 1 available) 42613 1727204606.13133: exiting _queue_task() for managed-node3/package 42613 1727204606.13145: done queuing things up, now waiting for results queue to drain 42613 1727204606.13147: waiting for pending results... 
42613 1727204606.13475: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 42613 1727204606.13575: in run() - task 127b8e07-fff9-2f91-05d8-00000000007a 42613 1727204606.13675: variable 'ansible_search_path' from source: unknown 42613 1727204606.13679: variable 'ansible_search_path' from source: unknown 42613 1727204606.13682: calling self._execute() 42613 1727204606.13781: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204606.13871: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204606.13876: variable 'omit' from source: magic vars 42613 1727204606.14289: variable 'ansible_distribution_major_version' from source: facts 42613 1727204606.14311: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204606.14470: variable 'network_state' from source: role '' defaults 42613 1727204606.14489: Evaluated conditional (network_state != {}): False 42613 1727204606.14496: when evaluation is False, skipping this task 42613 1727204606.14503: _execute() done 42613 1727204606.14511: dumping result to json 42613 1727204606.14519: done dumping result, returning 42613 1727204606.14534: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [127b8e07-fff9-2f91-05d8-00000000007a] 42613 1727204606.14552: sending task result for task 127b8e07-fff9-2f91-05d8-00000000007a 42613 1727204606.14749: done sending task result for task 127b8e07-fff9-2f91-05d8-00000000007a 42613 1727204606.14753: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 42613 1727204606.14814: no more pending results, returning what we have 42613 1727204606.14819: results queue empty 42613 1727204606.14822: checking 
for any_errors_fatal 42613 1727204606.14832: done checking for any_errors_fatal 42613 1727204606.14833: checking for max_fail_percentage 42613 1727204606.14835: done checking for max_fail_percentage 42613 1727204606.14837: checking to see if all hosts have failed and the running result is not ok 42613 1727204606.14838: done checking to see if all hosts have failed 42613 1727204606.14839: getting the remaining hosts for this loop 42613 1727204606.14840: done getting the remaining hosts for this loop 42613 1727204606.14846: getting the next task for host managed-node3 42613 1727204606.14855: done getting next task for host managed-node3 42613 1727204606.14860: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 42613 1727204606.14863: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204606.14884: getting variables 42613 1727204606.14886: in VariableManager get_vars() 42613 1727204606.14934: Calling all_inventory to load vars for managed-node3 42613 1727204606.14938: Calling groups_inventory to load vars for managed-node3 42613 1727204606.14940: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204606.15299: Calling all_plugins_play to load vars for managed-node3 42613 1727204606.15306: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204606.15310: Calling groups_plugins_play to load vars for managed-node3 42613 1727204606.19913: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204606.24486: done with get_vars() 42613 1727204606.24529: done getting variables 42613 1727204606.24715: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 15:03:26 -0400 (0:00:00.124) 0:00:34.855 ***** 42613 1727204606.24754: entering _queue_task() for managed-node3/package 42613 1727204606.25636: worker is 1 (out of 1 available) 42613 1727204606.25654: exiting _queue_task() for managed-node3/package 42613 1727204606.25670: done queuing things up, now waiting for results queue to drain 42613 1727204606.25671: waiting for pending results... 
42613 1727204606.26124: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 42613 1727204606.26245: in run() - task 127b8e07-fff9-2f91-05d8-00000000007b 42613 1727204606.26249: variable 'ansible_search_path' from source: unknown 42613 1727204606.26252: variable 'ansible_search_path' from source: unknown 42613 1727204606.26275: calling self._execute() 42613 1727204606.26399: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204606.26412: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204606.26429: variable 'omit' from source: magic vars 42613 1727204606.27198: variable 'ansible_distribution_major_version' from source: facts 42613 1727204606.27224: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204606.27634: variable 'network_state' from source: role '' defaults 42613 1727204606.27641: Evaluated conditional (network_state != {}): False 42613 1727204606.27644: when evaluation is False, skipping this task 42613 1727204606.27646: _execute() done 42613 1727204606.27690: dumping result to json 42613 1727204606.27758: done dumping result, returning 42613 1727204606.27777: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [127b8e07-fff9-2f91-05d8-00000000007b] 42613 1727204606.27790: sending task result for task 127b8e07-fff9-2f91-05d8-00000000007b 42613 1727204606.27960: done sending task result for task 127b8e07-fff9-2f91-05d8-00000000007b 42613 1727204606.28274: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 42613 1727204606.28332: no more pending results, returning what we have 42613 1727204606.28340: results queue empty 42613 1727204606.28342: checking for 
any_errors_fatal 42613 1727204606.28351: done checking for any_errors_fatal 42613 1727204606.28352: checking for max_fail_percentage 42613 1727204606.28355: done checking for max_fail_percentage 42613 1727204606.28356: checking to see if all hosts have failed and the running result is not ok 42613 1727204606.28357: done checking to see if all hosts have failed 42613 1727204606.28358: getting the remaining hosts for this loop 42613 1727204606.28360: done getting the remaining hosts for this loop 42613 1727204606.28365: getting the next task for host managed-node3 42613 1727204606.28376: done getting next task for host managed-node3 42613 1727204606.28381: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 42613 1727204606.28384: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204606.28403: getting variables 42613 1727204606.28405: in VariableManager get_vars() 42613 1727204606.28456: Calling all_inventory to load vars for managed-node3 42613 1727204606.28459: Calling groups_inventory to load vars for managed-node3 42613 1727204606.28462: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204606.28677: Calling all_plugins_play to load vars for managed-node3 42613 1727204606.28681: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204606.28685: Calling groups_plugins_play to load vars for managed-node3 42613 1727204606.32988: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204606.35320: done with get_vars() 42613 1727204606.35363: done getting variables 42613 1727204606.35436: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 15:03:26 -0400 (0:00:00.107) 0:00:34.962 ***** 42613 1727204606.35479: entering _queue_task() for managed-node3/service 42613 1727204606.35882: worker is 1 (out of 1 available) 42613 1727204606.35898: exiting _queue_task() for managed-node3/service 42613 1727204606.35912: done queuing things up, now waiting for results queue to drain 42613 1727204606.35913: waiting for pending results... 
42613 1727204606.36247: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 42613 1727204606.36387: in run() - task 127b8e07-fff9-2f91-05d8-00000000007c 42613 1727204606.36412: variable 'ansible_search_path' from source: unknown 42613 1727204606.36419: variable 'ansible_search_path' from source: unknown 42613 1727204606.36471: calling self._execute() 42613 1727204606.36613: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204606.36617: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204606.36620: variable 'omit' from source: magic vars 42613 1727204606.37045: variable 'ansible_distribution_major_version' from source: facts 42613 1727204606.37068: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204606.37273: variable '__network_wireless_connections_defined' from source: role '' defaults 42613 1727204606.37446: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 42613 1727204606.40023: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 42613 1727204606.40120: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 42613 1727204606.40163: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 42613 1727204606.40212: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 42613 1727204606.40247: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 42613 1727204606.40409: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 42613 1727204606.40413: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204606.40426: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204606.40480: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204606.40502: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204606.40873: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204606.40878: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204606.40881: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204606.41071: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204606.41074: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204606.41077: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204606.41128: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204606.41199: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204606.41430: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204606.41434: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204606.41736: variable 'network_connections' from source: play vars 42613 1727204606.41767: variable 'profile' from source: play vars 42613 1727204606.41856: variable 'profile' from source: play vars 42613 1727204606.41876: variable 'interface' from source: set_fact 42613 1727204606.41950: variable 'interface' from source: set_fact 42613 1727204606.42046: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 42613 1727204606.42271: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 42613 1727204606.42325: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 42613 1727204606.42367: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 42613 1727204606.42404: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 42613 1727204606.42571: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 42613 1727204606.42574: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 42613 1727204606.42576: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204606.42578: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 42613 1727204606.42609: variable '__network_team_connections_defined' from source: role '' defaults 42613 1727204606.42881: variable 'network_connections' from source: play vars 42613 1727204606.42893: variable 'profile' from source: play vars 42613 1727204606.42974: variable 'profile' from source: play vars 42613 1727204606.42982: variable 'interface' from source: set_fact 42613 1727204606.43236: variable 'interface' from source: set_fact 42613 1727204606.43242: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 42613 1727204606.43245: when evaluation is False, skipping this task 42613 1727204606.43248: _execute() done 42613 1727204606.43250: dumping result to json 42613 1727204606.43252: done dumping result, returning 42613 1727204606.43255: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart 
NetworkManager due to wireless or team interfaces [127b8e07-fff9-2f91-05d8-00000000007c] 42613 1727204606.43268: sending task result for task 127b8e07-fff9-2f91-05d8-00000000007c 42613 1727204606.43680: done sending task result for task 127b8e07-fff9-2f91-05d8-00000000007c 42613 1727204606.43684: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 42613 1727204606.43732: no more pending results, returning what we have 42613 1727204606.43736: results queue empty 42613 1727204606.43737: checking for any_errors_fatal 42613 1727204606.43747: done checking for any_errors_fatal 42613 1727204606.43748: checking for max_fail_percentage 42613 1727204606.43750: done checking for max_fail_percentage 42613 1727204606.43751: checking to see if all hosts have failed and the running result is not ok 42613 1727204606.43753: done checking to see if all hosts have failed 42613 1727204606.43753: getting the remaining hosts for this loop 42613 1727204606.43755: done getting the remaining hosts for this loop 42613 1727204606.43760: getting the next task for host managed-node3 42613 1727204606.43768: done getting next task for host managed-node3 42613 1727204606.43772: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 42613 1727204606.43775: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204606.43790: getting variables 42613 1727204606.43792: in VariableManager get_vars() 42613 1727204606.43834: Calling all_inventory to load vars for managed-node3 42613 1727204606.43837: Calling groups_inventory to load vars for managed-node3 42613 1727204606.43843: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204606.43855: Calling all_plugins_play to load vars for managed-node3 42613 1727204606.43859: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204606.43862: Calling groups_plugins_play to load vars for managed-node3 42613 1727204606.47508: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204606.49834: done with get_vars() 42613 1727204606.49878: done getting variables 42613 1727204606.49951: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 15:03:26 -0400 (0:00:00.145) 0:00:35.107 ***** 42613 1727204606.49988: entering _queue_task() for managed-node3/service 42613 1727204606.50406: worker is 1 (out of 1 available) 42613 1727204606.50423: exiting _queue_task() for managed-node3/service 42613 1727204606.50436: done queuing things up, now waiting for results queue to drain 42613 1727204606.50437: waiting for pending results... 
42613 1727204606.50783: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 42613 1727204606.51076: in run() - task 127b8e07-fff9-2f91-05d8-00000000007d 42613 1727204606.51080: variable 'ansible_search_path' from source: unknown 42613 1727204606.51083: variable 'ansible_search_path' from source: unknown 42613 1727204606.51086: calling self._execute() 42613 1727204606.51107: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204606.51120: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204606.51147: variable 'omit' from source: magic vars 42613 1727204606.51594: variable 'ansible_distribution_major_version' from source: facts 42613 1727204606.51620: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204606.51817: variable 'network_provider' from source: set_fact 42613 1727204606.51830: variable 'network_state' from source: role '' defaults 42613 1727204606.51853: Evaluated conditional (network_provider == "nm" or network_state != {}): True 42613 1727204606.51866: variable 'omit' from source: magic vars 42613 1727204606.51915: variable 'omit' from source: magic vars 42613 1727204606.51961: variable 'network_service_name' from source: role '' defaults 42613 1727204606.52057: variable 'network_service_name' from source: role '' defaults 42613 1727204606.52188: variable '__network_provider_setup' from source: role '' defaults 42613 1727204606.52272: variable '__network_service_name_default_nm' from source: role '' defaults 42613 1727204606.52277: variable '__network_service_name_default_nm' from source: role '' defaults 42613 1727204606.52291: variable '__network_packages_default_nm' from source: role '' defaults 42613 1727204606.52363: variable '__network_packages_default_nm' from source: role '' defaults 42613 1727204606.52641: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 42613 1727204606.55189: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 42613 1727204606.55299: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 42613 1727204606.55358: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 42613 1727204606.55411: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 42613 1727204606.55451: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 42613 1727204606.55552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204606.55645: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204606.55648: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204606.55676: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204606.55693: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204606.55750: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 42613 1727204606.55864: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204606.55867: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204606.55872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204606.55875: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204606.56618: variable '__network_packages_default_gobject_packages' from source: role '' defaults 42613 1727204606.56898: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204606.56961: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204606.57250: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204606.57253: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204606.57256: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204606.57437: variable 'ansible_python' from source: facts 42613 1727204606.57586: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 42613 1727204606.57850: variable '__network_wpa_supplicant_required' from source: role '' defaults 42613 1727204606.57976: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 42613 1727204606.58335: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204606.58376: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204606.58500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204606.58556: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204606.58637: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204606.58764: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204606.58808: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204606.59113: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204606.59117: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204606.59119: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204606.59451: variable 'network_connections' from source: play vars 42613 1727204606.59705: variable 'profile' from source: play vars 42613 1727204606.59974: variable 'profile' from source: play vars 42613 1727204606.59977: variable 'interface' from source: set_fact 42613 1727204606.60110: variable 'interface' from source: set_fact 42613 1727204606.60517: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 42613 1727204606.60889: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 42613 1727204606.61115: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 42613 1727204606.61152: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 42613 1727204606.61205: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 42613 1727204606.61367: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 42613 1727204606.61404: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 42613 1727204606.61449: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204606.61492: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 42613 1727204606.61556: variable '__network_wireless_connections_defined' from source: role '' defaults 42613 1727204606.61951: variable 'network_connections' from source: play vars 42613 1727204606.61964: variable 'profile' from source: play vars 42613 1727204606.62137: variable 'profile' from source: play vars 42613 1727204606.62152: variable 'interface' from source: set_fact 42613 1727204606.62228: variable 'interface' from source: set_fact 42613 1727204606.62274: variable '__network_packages_default_wireless' from source: role '' defaults 42613 1727204606.62527: variable '__network_wireless_connections_defined' from source: role '' defaults 42613 1727204606.63043: variable 'network_connections' from source: play vars 42613 1727204606.63090: variable 'profile' from source: play vars 42613 1727204606.63260: variable 'profile' from source: play vars 42613 1727204606.63473: variable 'interface' from source: set_fact 42613 1727204606.63573: variable 'interface' from source: set_fact 42613 1727204606.63612: variable '__network_packages_default_team' from source: role '' defaults 42613 1727204606.63850: variable '__network_team_connections_defined' from source: role '' defaults 42613 1727204606.64335: variable 
'network_connections' from source: play vars 42613 1727204606.64381: variable 'profile' from source: play vars 42613 1727204606.64474: variable 'profile' from source: play vars 42613 1727204606.64582: variable 'interface' from source: set_fact 42613 1727204606.64741: variable 'interface' from source: set_fact 42613 1727204606.64959: variable '__network_service_name_default_initscripts' from source: role '' defaults 42613 1727204606.65032: variable '__network_service_name_default_initscripts' from source: role '' defaults 42613 1727204606.65081: variable '__network_packages_default_initscripts' from source: role '' defaults 42613 1727204606.65226: variable '__network_packages_default_initscripts' from source: role '' defaults 42613 1727204606.65682: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 42613 1727204606.66346: variable 'network_connections' from source: play vars 42613 1727204606.66360: variable 'profile' from source: play vars 42613 1727204606.66440: variable 'profile' from source: play vars 42613 1727204606.66454: variable 'interface' from source: set_fact 42613 1727204606.66533: variable 'interface' from source: set_fact 42613 1727204606.66550: variable 'ansible_distribution' from source: facts 42613 1727204606.66574: variable '__network_rh_distros' from source: role '' defaults 42613 1727204606.66581: variable 'ansible_distribution_major_version' from source: facts 42613 1727204606.66670: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 42613 1727204606.66809: variable 'ansible_distribution' from source: facts 42613 1727204606.66818: variable '__network_rh_distros' from source: role '' defaults 42613 1727204606.66829: variable 'ansible_distribution_major_version' from source: facts 42613 1727204606.66842: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 42613 1727204606.67053: variable 'ansible_distribution' from source: 
facts 42613 1727204606.67062: variable '__network_rh_distros' from source: role '' defaults 42613 1727204606.67076: variable 'ansible_distribution_major_version' from source: facts 42613 1727204606.67121: variable 'network_provider' from source: set_fact 42613 1727204606.67153: variable 'omit' from source: magic vars 42613 1727204606.67195: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 42613 1727204606.67244: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 42613 1727204606.67273: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 42613 1727204606.67327: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204606.67331: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204606.67356: variable 'inventory_hostname' from source: host vars for 'managed-node3' 42613 1727204606.67364: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204606.67376: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204606.67502: Set connection var ansible_shell_executable to /bin/sh 42613 1727204606.67546: Set connection var ansible_pipelining to False 42613 1727204606.67549: Set connection var ansible_module_compression to ZIP_DEFLATED 42613 1727204606.67551: Set connection var ansible_connection to ssh 42613 1727204606.67553: Set connection var ansible_timeout to 10 42613 1727204606.67555: Set connection var ansible_shell_type to sh 42613 1727204606.67580: variable 'ansible_shell_executable' from source: unknown 42613 1727204606.67587: variable 'ansible_connection' from source: unknown 42613 1727204606.67594: variable 'ansible_module_compression' from source: unknown 42613 1727204606.67599: 
variable 'ansible_shell_type' from source: unknown 42613 1727204606.67605: variable 'ansible_shell_executable' from source: unknown 42613 1727204606.67654: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204606.67661: variable 'ansible_pipelining' from source: unknown 42613 1727204606.67664: variable 'ansible_timeout' from source: unknown 42613 1727204606.67668: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204606.67748: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 42613 1727204606.67768: variable 'omit' from source: magic vars 42613 1727204606.67779: starting attempt loop 42613 1727204606.67787: running the handler 42613 1727204606.67895: variable 'ansible_facts' from source: unknown 42613 1727204606.69068: _low_level_execute_command(): starting 42613 1727204606.69077: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 42613 1727204606.69788: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address <<< 42613 1727204606.69836: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204606.69897: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204606.69916: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204606.70162: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204606.70368: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204606.72214: stdout chunk (state=3): >>>/root <<< 42613 1727204606.72586: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204606.72590: stdout chunk (state=3): >>><<< 42613 1727204606.72593: stderr chunk (state=3): >>><<< 42613 1727204606.72596: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204606.72598: _low_level_execute_command(): starting 42613 1727204606.72601: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204606.72526-44938-145910887471280 `" && echo ansible-tmp-1727204606.72526-44938-145910887471280="` echo /root/.ansible/tmp/ansible-tmp-1727204606.72526-44938-145910887471280 `" ) && sleep 0' 42613 1727204606.73802: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204606.73948: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204606.74189: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204606.74283: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
42613 1727204606.76495: stdout chunk (state=3): >>>ansible-tmp-1727204606.72526-44938-145910887471280=/root/.ansible/tmp/ansible-tmp-1727204606.72526-44938-145910887471280 <<< 42613 1727204606.76600: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204606.76680: stderr chunk (state=3): >>><<< 42613 1727204606.76726: stdout chunk (state=3): >>><<< 42613 1727204606.77172: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204606.72526-44938-145910887471280=/root/.ansible/tmp/ansible-tmp-1727204606.72526-44938-145910887471280 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204606.77176: variable 'ansible_module_compression' from source: unknown 42613 1727204606.77178: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-42613igbktgy6/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 
42613 1727204606.77180: variable 'ansible_facts' from source: unknown 42613 1727204606.77690: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204606.72526-44938-145910887471280/AnsiballZ_systemd.py 42613 1727204606.78196: Sending initial data 42613 1727204606.78200: Sent initial data (154 bytes) 42613 1727204606.80091: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204606.80390: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204606.80611: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204606.80686: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204606.82472: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 42613 1727204606.82494: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 42613 1727204606.82512: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: 
Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 42613 1727204606.82605: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 42613 1727204606.82707: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-42613igbktgy6/tmp7bprfvip /root/.ansible/tmp/ansible-tmp-1727204606.72526-44938-145910887471280/AnsiballZ_systemd.py <<< 42613 1727204606.82720: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204606.72526-44938-145910887471280/AnsiballZ_systemd.py" <<< 42613 1727204606.82782: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-42613igbktgy6/tmp7bprfvip" to remote "/root/.ansible/tmp/ansible-tmp-1727204606.72526-44938-145910887471280/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204606.72526-44938-145910887471280/AnsiballZ_systemd.py" <<< 42613 1727204606.84794: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204606.84828: stderr chunk (state=3): >>><<< 42613 1727204606.84832: stdout chunk (state=3): >>><<< 42613 1727204606.85198: done transferring module to remote 42613 1727204606.85202: _low_level_execute_command(): starting 42613 1727204606.85205: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204606.72526-44938-145910887471280/ 
/root/.ansible/tmp/ansible-tmp-1727204606.72526-44938-145910887471280/AnsiballZ_systemd.py && sleep 0' 42613 1727204606.85983: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204606.86001: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204606.86104: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204606.88399: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204606.88403: stdout chunk (state=3): >>><<< 42613 1727204606.88406: stderr chunk (state=3): >>><<< 42613 1727204606.88575: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204606.88580: _low_level_execute_command(): starting 42613 1727204606.88583: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204606.72526-44938-145910887471280/AnsiballZ_systemd.py && sleep 0' 42613 1727204606.89443: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204606.89448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204606.89485: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration <<< 42613 1727204606.89489: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204606.89589: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204606.89594: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204606.89634: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204606.89719: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204607.23217: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "670", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:48:15 EDT", "ExecMainStartTimestampMonotonic": "32994154", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "670", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ 
path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3507", "MemoryCurrent": "11857920", "MemoryPeak": "13684736", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3517595648", "CPUUsageNSec": "3224287000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", 
"StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", 
"SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target NetworkManager-wait-online.service cloud-init.service network.service multi-user.target network.target", "After": "dbus-broker.service cloud-init-local.service systemd-journald.socket system.slice dbus.socket sysinit.target network-pre.target basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 15:00:56 EDT", "StateChangeTimestampMonotonic": "794185509", "InactiveExitTimestamp": "Tue 2024-09-24 14:48:15 EDT", "InactiveExitTimestampMonotonic": "32994691", "ActiveEnterTimestamp": "Tue 2024-09-24 14:48:17 EDT", "ActiveEnterTimestampMonotonic": "34735054", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": 
"no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:48:15 EDT", "ConditionTimestampMonotonic": "32982961", "AssertTimestamp": "Tue 2024-09-24 14:48:15 EDT", "AssertTimestampMonotonic": "32982965", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "4c25d2827e7b45838bcc13e108827a7f", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 42613 1727204607.25394: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
<<< 42613 1727204607.25399: stdout chunk (state=3): >>><<< 42613 1727204607.25470: stderr chunk (state=3): >>><<< 42613 1727204607.25478: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "670", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:48:15 EDT", "ExecMainStartTimestampMonotonic": "32994154", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "670", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; 
stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3507", "MemoryCurrent": "11857920", "MemoryPeak": "13684736", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3517595648", "CPUUsageNSec": "3224287000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", 
"MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", 
"ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", 
"Conflicts": "shutdown.target", "Before": "shutdown.target NetworkManager-wait-online.service cloud-init.service network.service multi-user.target network.target", "After": "dbus-broker.service cloud-init-local.service systemd-journald.socket system.slice dbus.socket sysinit.target network-pre.target basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 15:00:56 EDT", "StateChangeTimestampMonotonic": "794185509", "InactiveExitTimestamp": "Tue 2024-09-24 14:48:15 EDT", "InactiveExitTimestampMonotonic": "32994691", "ActiveEnterTimestamp": "Tue 2024-09-24 14:48:17 EDT", "ActiveEnterTimestampMonotonic": "34735054", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:48:15 EDT", "ConditionTimestampMonotonic": "32982961", "AssertTimestamp": "Tue 2024-09-24 14:48:15 EDT", "AssertTimestampMonotonic": "32982965", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", 
"SuccessAction": "none", "InvocationID": "4c25d2827e7b45838bcc13e108827a7f", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
42613 1727204607.25938: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204606.72526-44938-145910887471280/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 42613 1727204607.26036: _low_level_execute_command(): starting 42613 1727204607.26042: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204606.72526-44938-145910887471280/ > /dev/null 2>&1 && sleep 0' 42613 1727204607.27208: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204607.27213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204607.27237: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration <<< 42613 1727204607.27243: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 42613 1727204607.27284: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found <<< 42613 1727204607.27287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204607.27472: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204607.27482: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204607.27499: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204607.27593: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204607.29655: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204607.29754: stderr chunk (state=3): >>><<< 42613 1727204607.29800: stdout chunk (state=3): >>><<< 42613 1727204607.29984: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204607.29988: handler run complete 42613 1727204607.29991: attempt loop complete, returning result 42613 1727204607.29993: _execute() done 42613 1727204607.30080: dumping result to json 42613 1727204607.30162: done dumping result, returning 42613 1727204607.30371: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [127b8e07-fff9-2f91-05d8-00000000007d] 42613 1727204607.30374: sending task result for task 127b8e07-fff9-2f91-05d8-00000000007d 42613 1727204607.31180: done sending task result for task 127b8e07-fff9-2f91-05d8-00000000007d 42613 1727204607.31184: WORKER PROCESS EXITING ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 42613 1727204607.31250: no more pending results, returning what we have 42613 1727204607.31254: results queue empty 42613 1727204607.31256: checking for any_errors_fatal 42613 1727204607.31266: done checking for any_errors_fatal 42613 1727204607.31268: checking for max_fail_percentage 42613 1727204607.31273: done checking for max_fail_percentage 42613 1727204607.31274: checking to see if all hosts have failed and the running result is not ok 42613 1727204607.31275: done checking to see if all hosts have failed 42613 1727204607.31276: getting the remaining hosts for this loop 42613 1727204607.31278: done getting the remaining hosts for this loop 42613 1727204607.31283: getting the next task for host managed-node3 42613 1727204607.31290: done getting next task for host managed-node3 42613 1727204607.31406: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 42613 1727204607.31410: ^ state is: HOST 
STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 42613 1727204607.31422: getting variables 42613 1727204607.31424: in VariableManager get_vars() 42613 1727204607.31487: Calling all_inventory to load vars for managed-node3 42613 1727204607.31491: Calling groups_inventory to load vars for managed-node3 42613 1727204607.31498: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204607.31510: Calling all_plugins_play to load vars for managed-node3 42613 1727204607.31514: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204607.31581: Calling groups_plugins_play to load vars for managed-node3 42613 1727204607.34116: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204607.36903: done with get_vars() 42613 1727204607.36940: done getting variables 42613 1727204607.37008: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 15:03:27 -0400 (0:00:00.870) 0:00:35.978 ***** 42613 1727204607.37046: entering _queue_task() for managed-node3/service 42613 1727204607.37450: worker is 1 (out of 1 available) 42613 1727204607.37670: exiting _queue_task() for managed-node3/service 42613 1727204607.37682: done queuing things up, now waiting for results queue to drain 42613 1727204607.37683: 
waiting for pending results... 42613 1727204607.37813: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 42613 1727204607.37929: in run() - task 127b8e07-fff9-2f91-05d8-00000000007e 42613 1727204607.38271: variable 'ansible_search_path' from source: unknown 42613 1727204607.38276: variable 'ansible_search_path' from source: unknown 42613 1727204607.38279: calling self._execute() 42613 1727204607.38283: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204607.38286: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204607.38289: variable 'omit' from source: magic vars 42613 1727204607.38557: variable 'ansible_distribution_major_version' from source: facts 42613 1727204607.38570: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204607.38871: variable 'network_provider' from source: set_fact 42613 1727204607.38875: Evaluated conditional (network_provider == "nm"): True 42613 1727204607.38880: variable '__network_wpa_supplicant_required' from source: role '' defaults 42613 1727204607.39045: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 42613 1727204607.39512: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 42613 1727204607.43773: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 42613 1727204607.43778: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 42613 1727204607.43781: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 42613 1727204607.43804: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 42613 1727204607.43833: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 42613 1727204607.43948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204607.43985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204607.44012: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204607.44062: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204607.44270: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204607.44275: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204607.44278: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204607.44282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204607.44285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204607.44289: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204607.44292: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204607.44320: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204607.44344: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204607.44390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204607.44404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204607.44582: variable 'network_connections' from source: play vars 42613 1727204607.44594: variable 'profile' from source: play vars 42613 1727204607.44685: variable 'profile' from source: play vars 42613 1727204607.44689: variable 'interface' from source: set_fact 42613 1727204607.44759: variable 'interface' from source: set_fact 42613 1727204607.44845: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 42613 1727204607.45045: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 42613 1727204607.45092: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 42613 1727204607.45124: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 42613 1727204607.45159: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 42613 1727204607.45214: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 42613 1727204607.45236: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 42613 1727204607.45269: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204607.45300: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 42613 1727204607.45351: variable '__network_wireless_connections_defined' from source: role '' defaults 42613 1727204607.46128: variable 'network_connections' from source: play vars 42613 1727204607.46133: variable 'profile' from source: play vars 42613 1727204607.46372: variable 'profile' from source: play vars 42613 1727204607.46376: variable 'interface' from source: set_fact 42613 1727204607.46571: variable 'interface' from source: set_fact 42613 1727204607.46575: Evaluated conditional (__network_wpa_supplicant_required): False 42613 1727204607.46577: when evaluation is False, skipping this task 42613 1727204607.46580: _execute() done 42613 1727204607.46591: dumping result 
to json 42613 1727204607.46594: done dumping result, returning 42613 1727204607.46596: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [127b8e07-fff9-2f91-05d8-00000000007e] 42613 1727204607.46597: sending task result for task 127b8e07-fff9-2f91-05d8-00000000007e skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 42613 1727204607.46729: no more pending results, returning what we have 42613 1727204607.46732: results queue empty 42613 1727204607.46734: checking for any_errors_fatal 42613 1727204607.46756: done checking for any_errors_fatal 42613 1727204607.46757: checking for max_fail_percentage 42613 1727204607.46759: done checking for max_fail_percentage 42613 1727204607.46761: checking to see if all hosts have failed and the running result is not ok 42613 1727204607.46762: done checking to see if all hosts have failed 42613 1727204607.46762: getting the remaining hosts for this loop 42613 1727204607.46764: done getting the remaining hosts for this loop 42613 1727204607.46771: getting the next task for host managed-node3 42613 1727204607.46778: done getting next task for host managed-node3 42613 1727204607.46782: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 42613 1727204607.46785: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204607.46803: done sending task result for task 127b8e07-fff9-2f91-05d8-00000000007e 42613 1727204607.46807: WORKER PROCESS EXITING 42613 1727204607.47026: getting variables 42613 1727204607.47029: in VariableManager get_vars() 42613 1727204607.47078: Calling all_inventory to load vars for managed-node3 42613 1727204607.47081: Calling groups_inventory to load vars for managed-node3 42613 1727204607.47084: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204607.47097: Calling all_plugins_play to load vars for managed-node3 42613 1727204607.47100: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204607.47104: Calling groups_plugins_play to load vars for managed-node3 42613 1727204607.51345: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204607.56069: done with get_vars() 42613 1727204607.56110: done getting variables 42613 1727204607.56294: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 15:03:27 -0400 (0:00:00.192) 0:00:36.171 ***** 42613 1727204607.56327: entering _queue_task() for managed-node3/service 42613 1727204607.57112: worker is 1 (out of 1 available) 42613 1727204607.57349: exiting _queue_task() for managed-node3/service 42613 1727204607.57362: done queuing things up, now waiting for results queue to drain 42613 1727204607.57363: waiting for pending results... 
42613 1727204607.57777: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service 42613 1727204607.58128: in run() - task 127b8e07-fff9-2f91-05d8-00000000007f 42613 1727204607.58143: variable 'ansible_search_path' from source: unknown 42613 1727204607.58147: variable 'ansible_search_path' from source: unknown 42613 1727204607.58573: calling self._execute() 42613 1727204607.58578: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204607.58582: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204607.58587: variable 'omit' from source: magic vars 42613 1727204607.59349: variable 'ansible_distribution_major_version' from source: facts 42613 1727204607.59485: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204607.59777: variable 'network_provider' from source: set_fact 42613 1727204607.59784: Evaluated conditional (network_provider == "initscripts"): False 42613 1727204607.59787: when evaluation is False, skipping this task 42613 1727204607.59798: _execute() done 42613 1727204607.59801: dumping result to json 42613 1727204607.59804: done dumping result, returning 42613 1727204607.59815: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service [127b8e07-fff9-2f91-05d8-00000000007f] 42613 1727204607.59820: sending task result for task 127b8e07-fff9-2f91-05d8-00000000007f 42613 1727204607.60000: done sending task result for task 127b8e07-fff9-2f91-05d8-00000000007f 42613 1727204607.60003: WORKER PROCESS EXITING skipping: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 42613 1727204607.60056: no more pending results, returning what we have 42613 1727204607.60063: results queue empty 42613 1727204607.60065: checking for any_errors_fatal 42613 1727204607.60084: done checking for 
any_errors_fatal 42613 1727204607.60085: checking for max_fail_percentage 42613 1727204607.60088: done checking for max_fail_percentage 42613 1727204607.60089: checking to see if all hosts have failed and the running result is not ok 42613 1727204607.60090: done checking to see if all hosts have failed 42613 1727204607.60091: getting the remaining hosts for this loop 42613 1727204607.60093: done getting the remaining hosts for this loop 42613 1727204607.60098: getting the next task for host managed-node3 42613 1727204607.60106: done getting next task for host managed-node3 42613 1727204607.60111: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 42613 1727204607.60114: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204607.60132: getting variables 42613 1727204607.60134: in VariableManager get_vars() 42613 1727204607.60484: Calling all_inventory to load vars for managed-node3 42613 1727204607.60488: Calling groups_inventory to load vars for managed-node3 42613 1727204607.60490: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204607.60502: Calling all_plugins_play to load vars for managed-node3 42613 1727204607.60505: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204607.60508: Calling groups_plugins_play to load vars for managed-node3 42613 1727204607.64866: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204607.69652: done with get_vars() 42613 1727204607.69754: done getting variables 42613 1727204607.69822: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 15:03:27 -0400 (0:00:00.136) 0:00:36.307 ***** 42613 1727204607.69974: entering _queue_task() for managed-node3/copy 42613 1727204607.70702: worker is 1 (out of 1 available) 42613 1727204607.70944: exiting _queue_task() for managed-node3/copy 42613 1727204607.70958: done queuing things up, now waiting for results queue to drain 42613 1727204607.70959: waiting for pending results... 
42613 1727204607.71492: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 42613 1727204607.71874: in run() - task 127b8e07-fff9-2f91-05d8-000000000080 42613 1727204607.71879: variable 'ansible_search_path' from source: unknown 42613 1727204607.71883: variable 'ansible_search_path' from source: unknown 42613 1727204607.71886: calling self._execute() 42613 1727204607.72174: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204607.72179: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204607.72182: variable 'omit' from source: magic vars 42613 1727204607.72986: variable 'ansible_distribution_major_version' from source: facts 42613 1727204607.72997: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204607.73471: variable 'network_provider' from source: set_fact 42613 1727204607.73475: Evaluated conditional (network_provider == "initscripts"): False 42613 1727204607.73479: when evaluation is False, skipping this task 42613 1727204607.73481: _execute() done 42613 1727204607.73484: dumping result to json 42613 1727204607.73487: done dumping result, returning 42613 1727204607.73490: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [127b8e07-fff9-2f91-05d8-000000000080] 42613 1727204607.73493: sending task result for task 127b8e07-fff9-2f91-05d8-000000000080 42613 1727204607.73610: done sending task result for task 127b8e07-fff9-2f91-05d8-000000000080 42613 1727204607.73644: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 42613 1727204607.73705: no more pending results, returning what we have 42613 1727204607.73709: results queue empty 42613 1727204607.73710: checking for 
any_errors_fatal 42613 1727204607.73719: done checking for any_errors_fatal 42613 1727204607.73719: checking for max_fail_percentage 42613 1727204607.73722: done checking for max_fail_percentage 42613 1727204607.73723: checking to see if all hosts have failed and the running result is not ok 42613 1727204607.73724: done checking to see if all hosts have failed 42613 1727204607.73725: getting the remaining hosts for this loop 42613 1727204607.73727: done getting the remaining hosts for this loop 42613 1727204607.73731: getting the next task for host managed-node3 42613 1727204607.73739: done getting next task for host managed-node3 42613 1727204607.73743: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 42613 1727204607.73745: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204607.73762: getting variables 42613 1727204607.73764: in VariableManager get_vars() 42613 1727204607.73813: Calling all_inventory to load vars for managed-node3 42613 1727204607.73817: Calling groups_inventory to load vars for managed-node3 42613 1727204607.73819: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204607.73834: Calling all_plugins_play to load vars for managed-node3 42613 1727204607.73838: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204607.73842: Calling groups_plugins_play to load vars for managed-node3 42613 1727204607.77950: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204607.83008: done with get_vars() 42613 1727204607.83163: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 15:03:27 -0400 (0:00:00.133) 0:00:36.441 ***** 42613 1727204607.83373: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 42613 1727204607.84116: worker is 1 (out of 1 available) 42613 1727204607.84245: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 42613 1727204607.84260: done queuing things up, now waiting for results queue to drain 42613 1727204607.84261: waiting for pending results... 
42613 1727204607.84987: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 42613 1727204607.85100: in run() - task 127b8e07-fff9-2f91-05d8-000000000081 42613 1727204607.85120: variable 'ansible_search_path' from source: unknown 42613 1727204607.85124: variable 'ansible_search_path' from source: unknown 42613 1727204607.85164: calling self._execute() 42613 1727204607.85490: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204607.85496: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204607.85500: variable 'omit' from source: magic vars 42613 1727204607.86371: variable 'ansible_distribution_major_version' from source: facts 42613 1727204607.86383: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204607.86390: variable 'omit' from source: magic vars 42613 1727204607.86557: variable 'omit' from source: magic vars 42613 1727204607.87041: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 42613 1727204607.92124: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 42613 1727204607.92412: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 42613 1727204607.92418: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 42613 1727204607.92544: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 42613 1727204607.92572: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 42613 1727204607.92780: variable 'network_provider' from source: set_fact 42613 1727204607.93174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204607.93214: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204607.93244: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204607.93374: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204607.93377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204607.93608: variable 'omit' from source: magic vars 42613 1727204607.93787: variable 'omit' from source: magic vars 42613 1727204607.94088: variable 'network_connections' from source: play vars 42613 1727204607.94101: variable 'profile' from source: play vars 42613 1727204607.94298: variable 'profile' from source: play vars 42613 1727204607.94302: variable 'interface' from source: set_fact 42613 1727204607.94480: variable 'interface' from source: set_fact 42613 1727204607.94757: variable 'omit' from source: magic vars 42613 1727204607.94768: variable '__lsr_ansible_managed' from source: task vars 42613 1727204607.95018: variable '__lsr_ansible_managed' from source: task vars 42613 1727204607.95706: Loaded config def from plugin (lookup/template) 42613 1727204607.95710: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 42613 1727204607.95746: File lookup term: get_ansible_managed.j2 42613 
1727204607.95750: variable 'ansible_search_path' from source: unknown 42613 1727204607.95753: evaluation_path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 42613 1727204607.95887: search_path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 42613 1727204607.96048: variable 'ansible_search_path' from source: unknown 42613 1727204608.31194: variable 'ansible_managed' from source: unknown 42613 1727204608.31860: variable 'omit' from source: magic vars 42613 1727204608.32173: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 42613 1727204608.32178: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 42613 1727204608.32180: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 42613 1727204608.32182: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204608.32201: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204608.32231: variable 'inventory_hostname' from source: host vars for 'managed-node3' 42613 1727204608.32246: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204608.32257: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204608.32578: Set connection var ansible_shell_executable to /bin/sh 42613 1727204608.32770: Set connection var ansible_pipelining to False 42613 1727204608.32774: Set connection var ansible_module_compression to ZIP_DEFLATED 42613 1727204608.32776: Set connection var ansible_connection to ssh 42613 1727204608.32778: Set connection var ansible_timeout to 10 42613 1727204608.32780: Set connection var ansible_shell_type to sh 42613 1727204608.32782: variable 'ansible_shell_executable' from source: unknown 42613 1727204608.32784: variable 'ansible_connection' from source: unknown 42613 1727204608.32786: variable 'ansible_module_compression' from source: unknown 42613 1727204608.32787: variable 'ansible_shell_type' from source: unknown 42613 1727204608.32789: variable 'ansible_shell_executable' from source: unknown 42613 1727204608.32791: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204608.32793: variable 'ansible_pipelining' from source: unknown 42613 1727204608.32795: variable 'ansible_timeout' from source: unknown 42613 1727204608.32797: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204608.33007: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 42613 1727204608.33035: variable 'omit' from source: magic vars 42613 1727204608.33049: starting attempt loop 42613 1727204608.33056: running the handler 42613 1727204608.33075: _low_level_execute_command(): starting 42613 1727204608.33086: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 42613 1727204608.34595: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204608.34716: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204608.34996: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204608.35121: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204608.36940: stdout chunk (state=3): >>>/root <<< 42613 1727204608.37143: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204608.37147: stdout chunk 
(state=3): >>><<< 42613 1727204608.37150: stderr chunk (state=3): >>><<< 42613 1727204608.37173: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204608.37193: _low_level_execute_command(): starting 42613 1727204608.37203: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204608.3718133-45053-11215117556685 `" && echo ansible-tmp-1727204608.3718133-45053-11215117556685="` echo /root/.ansible/tmp/ansible-tmp-1727204608.3718133-45053-11215117556685 `" ) && sleep 0' 42613 1727204608.38918: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204608.38935: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204608.39231: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204608.39248: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204608.39422: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204608.39474: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204608.41651: stdout chunk (state=3): >>>ansible-tmp-1727204608.3718133-45053-11215117556685=/root/.ansible/tmp/ansible-tmp-1727204608.3718133-45053-11215117556685 <<< 42613 1727204608.41729: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204608.42008: stderr chunk (state=3): >>><<< 42613 1727204608.42372: stdout chunk (state=3): >>><<< 42613 1727204608.42377: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204608.3718133-45053-11215117556685=/root/.ansible/tmp/ansible-tmp-1727204608.3718133-45053-11215117556685 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204608.42379: variable 'ansible_module_compression' from source: unknown 42613 1727204608.42381: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-42613igbktgy6/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 42613 1727204608.42384: variable 'ansible_facts' from source: unknown 42613 1727204608.42696: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204608.3718133-45053-11215117556685/AnsiballZ_network_connections.py 42613 1727204608.43129: Sending initial data 42613 1727204608.43134: Sent initial data (167 bytes) 42613 1727204608.44638: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204608.44993: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204608.45135: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204608.46932: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 42613 1727204608.46999: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 42613 1727204608.47071: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-42613igbktgy6/tmpi_rduz26 /root/.ansible/tmp/ansible-tmp-1727204608.3718133-45053-11215117556685/AnsiballZ_network_connections.py <<< 42613 1727204608.47086: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204608.3718133-45053-11215117556685/AnsiballZ_network_connections.py" <<< 42613 1727204608.47144: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-42613igbktgy6/tmpi_rduz26" to remote "/root/.ansible/tmp/ansible-tmp-1727204608.3718133-45053-11215117556685/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204608.3718133-45053-11215117556685/AnsiballZ_network_connections.py" <<< 42613 1727204608.49297: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204608.49302: stdout chunk (state=3): >>><<< 42613 1727204608.49305: stderr chunk (state=3): >>><<< 42613 1727204608.49308: done transferring module to remote 42613 1727204608.49310: _low_level_execute_command(): starting 42613 1727204608.49312: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204608.3718133-45053-11215117556685/ /root/.ansible/tmp/ansible-tmp-1727204608.3718133-45053-11215117556685/AnsiballZ_network_connections.py && sleep 0' 42613 1727204608.50574: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204608.50580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 
1727204608.50660: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204608.51119: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204608.51123: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204608.53123: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204608.53209: stderr chunk (state=3): >>><<< 42613 1727204608.53213: stdout chunk (state=3): >>><<< 42613 1727204608.53233: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204608.53246: _low_level_execute_command(): starting 42613 1727204608.53257: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204608.3718133-45053-11215117556685/AnsiballZ_network_connections.py && sleep 0' 42613 1727204608.55033: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204608.55037: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 42613 1727204608.55187: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration <<< 42613 1727204608.55204: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204608.55370: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204608.55599: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 <<< 42613 1727204608.55747: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204608.94273: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 42613 1727204608.96954: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. <<< 42613 1727204608.96959: stdout chunk (state=3): >>><<< 42613 1727204608.96962: stderr chunk (state=3): >>><<< 42613 1727204608.96988: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 42613 1727204608.97026: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'ethtest0', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204608.3718133-45053-11215117556685/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 42613 1727204608.97035: _low_level_execute_command(): starting 42613 1727204608.97043: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r 
/root/.ansible/tmp/ansible-tmp-1727204608.3718133-45053-11215117556685/ > /dev/null 2>&1 && sleep 0' 42613 1727204608.97716: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204608.97724: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204608.97735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204608.97756: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 42613 1727204608.97770: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 42613 1727204608.97777: stderr chunk (state=3): >>>debug2: match not found <<< 42613 1727204608.97787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204608.97800: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 42613 1727204608.97810: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.169 is address <<< 42613 1727204608.97845: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 42613 1727204608.97850: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204608.97853: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204608.97856: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 42613 1727204608.97859: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 42613 1727204608.97861: stderr chunk (state=3): >>>debug2: match found <<< 42613 1727204608.97864: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204608.97963: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/1846617821' <<< 42613 1727204608.97969: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204608.97995: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204608.98095: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204609.00241: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204609.00776: stderr chunk (state=3): >>><<< 42613 1727204609.00786: stdout chunk (state=3): >>><<< 42613 1727204609.00789: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204609.00792: handler run complete 42613 1727204609.00795: attempt loop complete, returning result 42613 1727204609.00797: _execute() done 42613 1727204609.00799: dumping result to json 
42613 1727204609.00801: done dumping result, returning 42613 1727204609.00949: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [127b8e07-fff9-2f91-05d8-000000000081] 42613 1727204609.00992: sending task result for task 127b8e07-fff9-2f91-05d8-000000000081 42613 1727204609.01295: done sending task result for task 127b8e07-fff9-2f91-05d8-000000000081 42613 1727204609.01300: WORKER PROCESS EXITING changed: [managed-node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "ethtest0", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 42613 1727204609.01491: no more pending results, returning what we have 42613 1727204609.01495: results queue empty 42613 1727204609.01496: checking for any_errors_fatal 42613 1727204609.01503: done checking for any_errors_fatal 42613 1727204609.01509: checking for max_fail_percentage 42613 1727204609.01511: done checking for max_fail_percentage 42613 1727204609.01512: checking to see if all hosts have failed and the running result is not ok 42613 1727204609.01513: done checking to see if all hosts have failed 42613 1727204609.01514: getting the remaining hosts for this loop 42613 1727204609.01515: done getting the remaining hosts for this loop 42613 1727204609.01519: getting the next task for host managed-node3 42613 1727204609.01525: done getting next task for host managed-node3 42613 1727204609.01529: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 42613 1727204609.01531: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204609.01541: getting variables 42613 1727204609.01543: in VariableManager get_vars() 42613 1727204609.01696: Calling all_inventory to load vars for managed-node3 42613 1727204609.01699: Calling groups_inventory to load vars for managed-node3 42613 1727204609.01701: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204609.01712: Calling all_plugins_play to load vars for managed-node3 42613 1727204609.01715: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204609.01717: Calling groups_plugins_play to load vars for managed-node3 42613 1727204609.03860: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204609.06793: done with get_vars() 42613 1727204609.06839: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 15:03:29 -0400 (0:00:01.235) 0:00:37.677 ***** 42613 1727204609.06937: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_state 42613 1727204609.07456: worker is 1 (out of 1 available) 42613 1727204609.07473: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_state 42613 1727204609.07485: done queuing things up, now waiting for results queue to drain 42613 1727204609.07486: waiting for pending results... 
42613 1727204609.07892: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state 42613 1727204609.07899: in run() - task 127b8e07-fff9-2f91-05d8-000000000082 42613 1727204609.07902: variable 'ansible_search_path' from source: unknown 42613 1727204609.07906: variable 'ansible_search_path' from source: unknown 42613 1727204609.07909: calling self._execute() 42613 1727204609.08022: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204609.08027: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204609.08044: variable 'omit' from source: magic vars 42613 1727204609.08491: variable 'ansible_distribution_major_version' from source: facts 42613 1727204609.08503: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204609.08757: variable 'network_state' from source: role '' defaults 42613 1727204609.08761: Evaluated conditional (network_state != {}): False 42613 1727204609.08764: when evaluation is False, skipping this task 42613 1727204609.08769: _execute() done 42613 1727204609.08773: dumping result to json 42613 1727204609.08775: done dumping result, returning 42613 1727204609.08973: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state [127b8e07-fff9-2f91-05d8-000000000082] 42613 1727204609.08977: sending task result for task 127b8e07-fff9-2f91-05d8-000000000082 42613 1727204609.09047: done sending task result for task 127b8e07-fff9-2f91-05d8-000000000082 42613 1727204609.09051: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 42613 1727204609.09106: no more pending results, returning what we have 42613 1727204609.09109: results queue empty 42613 1727204609.09110: checking for any_errors_fatal 42613 1727204609.09122: done checking for any_errors_fatal 
42613 1727204609.09122: checking for max_fail_percentage 42613 1727204609.09125: done checking for max_fail_percentage 42613 1727204609.09126: checking to see if all hosts have failed and the running result is not ok 42613 1727204609.09127: done checking to see if all hosts have failed 42613 1727204609.09128: getting the remaining hosts for this loop 42613 1727204609.09130: done getting the remaining hosts for this loop 42613 1727204609.09134: getting the next task for host managed-node3 42613 1727204609.09143: done getting next task for host managed-node3 42613 1727204609.09148: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 42613 1727204609.09151: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204609.09169: getting variables 42613 1727204609.09170: in VariableManager get_vars() 42613 1727204609.09208: Calling all_inventory to load vars for managed-node3 42613 1727204609.09211: Calling groups_inventory to load vars for managed-node3 42613 1727204609.09213: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204609.09224: Calling all_plugins_play to load vars for managed-node3 42613 1727204609.09227: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204609.09230: Calling groups_plugins_play to load vars for managed-node3 42613 1727204609.11294: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204609.13685: done with get_vars() 42613 1727204609.13730: done getting variables 42613 1727204609.13806: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 15:03:29 -0400 (0:00:00.069) 0:00:37.746 ***** 42613 1727204609.13844: entering _queue_task() for managed-node3/debug 42613 1727204609.14275: worker is 1 (out of 1 available) 42613 1727204609.14290: exiting _queue_task() for managed-node3/debug 42613 1727204609.14417: done queuing things up, now waiting for results queue to drain 42613 1727204609.14419: waiting for pending results... 
42613 1727204609.14659: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 42613 1727204609.14877: in run() - task 127b8e07-fff9-2f91-05d8-000000000083 42613 1727204609.14881: variable 'ansible_search_path' from source: unknown 42613 1727204609.14884: variable 'ansible_search_path' from source: unknown 42613 1727204609.14887: calling self._execute() 42613 1727204609.14972: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204609.14977: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204609.14980: variable 'omit' from source: magic vars 42613 1727204609.15797: variable 'ansible_distribution_major_version' from source: facts 42613 1727204609.15801: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204609.15804: variable 'omit' from source: magic vars 42613 1727204609.16062: variable 'omit' from source: magic vars 42613 1727204609.16083: variable 'omit' from source: magic vars 42613 1727204609.16130: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 42613 1727204609.16292: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 42613 1727204609.16312: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 42613 1727204609.16369: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204609.16379: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204609.16510: variable 'inventory_hostname' from source: host vars for 'managed-node3' 42613 1727204609.16514: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204609.16517: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node3' 42613 1727204609.16746: Set connection var ansible_shell_executable to /bin/sh 42613 1727204609.16753: Set connection var ansible_pipelining to False 42613 1727204609.16799: Set connection var ansible_module_compression to ZIP_DEFLATED 42613 1727204609.16803: Set connection var ansible_connection to ssh 42613 1727204609.16806: Set connection var ansible_timeout to 10 42613 1727204609.16808: Set connection var ansible_shell_type to sh 42613 1727204609.16810: variable 'ansible_shell_executable' from source: unknown 42613 1727204609.16852: variable 'ansible_connection' from source: unknown 42613 1727204609.16856: variable 'ansible_module_compression' from source: unknown 42613 1727204609.16859: variable 'ansible_shell_type' from source: unknown 42613 1727204609.16861: variable 'ansible_shell_executable' from source: unknown 42613 1727204609.16863: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204609.16868: variable 'ansible_pipelining' from source: unknown 42613 1727204609.16871: variable 'ansible_timeout' from source: unknown 42613 1727204609.16874: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204609.17020: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 42613 1727204609.17031: variable 'omit' from source: magic vars 42613 1727204609.17037: starting attempt loop 42613 1727204609.17040: running the handler 42613 1727204609.17206: variable '__network_connections_result' from source: set_fact 42613 1727204609.17273: handler run complete 42613 1727204609.17294: attempt loop complete, returning result 42613 1727204609.17297: _execute() done 42613 1727204609.17300: dumping result to json 42613 1727204609.17303: 
done dumping result, returning 42613 1727204609.17316: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [127b8e07-fff9-2f91-05d8-000000000083] 42613 1727204609.17319: sending task result for task 127b8e07-fff9-2f91-05d8-000000000083 42613 1727204609.17433: done sending task result for task 127b8e07-fff9-2f91-05d8-000000000083 42613 1727204609.17436: WORKER PROCESS EXITING ok: [managed-node3] => { "__network_connections_result.stderr_lines": [ "" ] } 42613 1727204609.17514: no more pending results, returning what we have 42613 1727204609.17519: results queue empty 42613 1727204609.17520: checking for any_errors_fatal 42613 1727204609.17529: done checking for any_errors_fatal 42613 1727204609.17529: checking for max_fail_percentage 42613 1727204609.17532: done checking for max_fail_percentage 42613 1727204609.17533: checking to see if all hosts have failed and the running result is not ok 42613 1727204609.17534: done checking to see if all hosts have failed 42613 1727204609.17534: getting the remaining hosts for this loop 42613 1727204609.17536: done getting the remaining hosts for this loop 42613 1727204609.17543: getting the next task for host managed-node3 42613 1727204609.17555: done getting next task for host managed-node3 42613 1727204609.17561: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 42613 1727204609.17563: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204609.17577: getting variables 42613 1727204609.17579: in VariableManager get_vars() 42613 1727204609.17624: Calling all_inventory to load vars for managed-node3 42613 1727204609.17627: Calling groups_inventory to load vars for managed-node3 42613 1727204609.17630: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204609.17646: Calling all_plugins_play to load vars for managed-node3 42613 1727204609.17650: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204609.17655: Calling groups_plugins_play to load vars for managed-node3 42613 1727204609.20297: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204609.36719: done with get_vars() 42613 1727204609.36889: done getting variables 42613 1727204609.37108: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 15:03:29 -0400 (0:00:00.232) 0:00:37.979 ***** 42613 1727204609.37140: entering _queue_task() for managed-node3/debug 42613 1727204609.37927: worker is 1 (out of 1 available) 42613 1727204609.37944: exiting _queue_task() for managed-node3/debug 42613 1727204609.38281: done queuing things up, now waiting for results queue to drain 42613 1727204609.38283: waiting for pending results... 
42613 1727204609.39292: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 42613 1727204609.39298: in run() - task 127b8e07-fff9-2f91-05d8-000000000084 42613 1727204609.39301: variable 'ansible_search_path' from source: unknown 42613 1727204609.39304: variable 'ansible_search_path' from source: unknown 42613 1727204609.39307: calling self._execute() 42613 1727204609.39864: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204609.39872: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204609.39875: variable 'omit' from source: magic vars 42613 1727204609.41494: variable 'ansible_distribution_major_version' from source: facts 42613 1727204609.41498: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204609.41501: variable 'omit' from source: magic vars 42613 1727204609.41503: variable 'omit' from source: magic vars 42613 1727204609.41579: variable 'omit' from source: magic vars 42613 1727204609.41857: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 42613 1727204609.41971: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 42613 1727204609.41999: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 42613 1727204609.42021: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204609.42034: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204609.42194: variable 'inventory_hostname' from source: host vars for 'managed-node3' 42613 1727204609.42198: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204609.42200: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node3' 42613 1727204609.42637: Set connection var ansible_shell_executable to /bin/sh 42613 1727204609.42644: Set connection var ansible_pipelining to False 42613 1727204609.42647: Set connection var ansible_module_compression to ZIP_DEFLATED 42613 1727204609.42649: Set connection var ansible_connection to ssh 42613 1727204609.42656: Set connection var ansible_timeout to 10 42613 1727204609.42658: Set connection var ansible_shell_type to sh 42613 1727204609.42692: variable 'ansible_shell_executable' from source: unknown 42613 1727204609.42696: variable 'ansible_connection' from source: unknown 42613 1727204609.42699: variable 'ansible_module_compression' from source: unknown 42613 1727204609.42701: variable 'ansible_shell_type' from source: unknown 42613 1727204609.42704: variable 'ansible_shell_executable' from source: unknown 42613 1727204609.43174: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204609.43178: variable 'ansible_pipelining' from source: unknown 42613 1727204609.43181: variable 'ansible_timeout' from source: unknown 42613 1727204609.43183: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204609.43411: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 42613 1727204609.43718: variable 'omit' from source: magic vars 42613 1727204609.43722: starting attempt loop 42613 1727204609.43724: running the handler 42613 1727204609.43971: variable '__network_connections_result' from source: set_fact 42613 1727204609.44074: variable '__network_connections_result' from source: set_fact 42613 1727204609.44628: handler run complete 42613 1727204609.44655: attempt loop complete, returning result 42613 1727204609.44659: 
_execute() done 42613 1727204609.44661: dumping result to json 42613 1727204609.44664: done dumping result, returning 42613 1727204609.45273: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [127b8e07-fff9-2f91-05d8-000000000084] 42613 1727204609.45277: sending task result for task 127b8e07-fff9-2f91-05d8-000000000084 42613 1727204609.45573: done sending task result for task 127b8e07-fff9-2f91-05d8-000000000084 42613 1727204609.45578: WORKER PROCESS EXITING ok: [managed-node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "ethtest0", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 42613 1727204609.45678: no more pending results, returning what we have 42613 1727204609.45683: results queue empty 42613 1727204609.45689: checking for any_errors_fatal 42613 1727204609.45701: done checking for any_errors_fatal 42613 1727204609.45702: checking for max_fail_percentage 42613 1727204609.45704: done checking for max_fail_percentage 42613 1727204609.45705: checking to see if all hosts have failed and the running result is not ok 42613 1727204609.45706: done checking to see if all hosts have failed 42613 1727204609.45707: getting the remaining hosts for this loop 42613 1727204609.45709: done getting the remaining hosts for this loop 42613 1727204609.45713: getting the next task for host managed-node3 42613 1727204609.45721: done getting next task for host managed-node3 42613 1727204609.45725: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 42613 1727204609.45727: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 42613 1727204609.45739: getting variables 42613 1727204609.45741: in VariableManager get_vars() 42613 1727204609.46289: Calling all_inventory to load vars for managed-node3 42613 1727204609.46292: Calling groups_inventory to load vars for managed-node3 42613 1727204609.46295: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204609.46308: Calling all_plugins_play to load vars for managed-node3 42613 1727204609.46311: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204609.46315: Calling groups_plugins_play to load vars for managed-node3 42613 1727204609.51490: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204609.57432: done with get_vars() 42613 1727204609.57677: done getting variables 42613 1727204609.57747: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 15:03:29 -0400 (0:00:00.208) 0:00:38.188 ***** 42613 1727204609.57997: entering _queue_task() for managed-node3/debug 42613 1727204609.58608: worker is 1 (out of 1 available) 42613 1727204609.58623: exiting _queue_task() for managed-node3/debug 42613 1727204609.58637: done queuing things up, now waiting for results queue to drain 42613 1727204609.58639: waiting for pending results... 
42613 1727204609.59196: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 42613 1727204609.59574: in run() - task 127b8e07-fff9-2f91-05d8-000000000085 42613 1727204609.59579: variable 'ansible_search_path' from source: unknown 42613 1727204609.59583: variable 'ansible_search_path' from source: unknown 42613 1727204609.59763: calling self._execute() 42613 1727204609.60157: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204609.60162: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204609.60164: variable 'omit' from source: magic vars 42613 1727204609.60926: variable 'ansible_distribution_major_version' from source: facts 42613 1727204609.60943: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204609.61159: variable 'network_state' from source: role '' defaults 42613 1727204609.61163: Evaluated conditional (network_state != {}): False 42613 1727204609.61168: when evaluation is False, skipping this task 42613 1727204609.61172: _execute() done 42613 1727204609.61175: dumping result to json 42613 1727204609.61178: done dumping result, returning 42613 1727204609.61181: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [127b8e07-fff9-2f91-05d8-000000000085] 42613 1727204609.61183: sending task result for task 127b8e07-fff9-2f91-05d8-000000000085 skipping: [managed-node3] => { "false_condition": "network_state != {}" } 42613 1727204609.61580: no more pending results, returning what we have 42613 1727204609.61584: results queue empty 42613 1727204609.61586: checking for any_errors_fatal 42613 1727204609.61598: done checking for any_errors_fatal 42613 1727204609.61599: checking for max_fail_percentage 42613 1727204609.61601: done checking for max_fail_percentage 42613 1727204609.61602: checking to see if all hosts have 
failed and the running result is not ok 42613 1727204609.61603: done checking to see if all hosts have failed 42613 1727204609.61604: getting the remaining hosts for this loop 42613 1727204609.61606: done getting the remaining hosts for this loop 42613 1727204609.61611: getting the next task for host managed-node3 42613 1727204609.61619: done getting next task for host managed-node3 42613 1727204609.61625: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 42613 1727204609.61628: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 42613 1727204609.61644: getting variables 42613 1727204609.61646: in VariableManager get_vars() 42613 1727204609.61697: Calling all_inventory to load vars for managed-node3 42613 1727204609.61701: Calling groups_inventory to load vars for managed-node3 42613 1727204609.61703: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204609.61721: Calling all_plugins_play to load vars for managed-node3 42613 1727204609.61725: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204609.61730: Calling groups_plugins_play to load vars for managed-node3 42613 1727204609.62274: done sending task result for task 127b8e07-fff9-2f91-05d8-000000000085 42613 1727204609.62279: WORKER PROCESS EXITING 42613 1727204609.66105: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204609.69431: done with get_vars() 42613 1727204609.69683: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 15:03:29 -0400 
(0:00:00.119) 0:00:38.307 ***** 42613 1727204609.69899: entering _queue_task() for managed-node3/ping 42613 1727204609.70619: worker is 1 (out of 1 available) 42613 1727204609.70633: exiting _queue_task() for managed-node3/ping 42613 1727204609.70647: done queuing things up, now waiting for results queue to drain 42613 1727204609.70649: waiting for pending results... 42613 1727204609.71507: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 42613 1727204609.71798: in run() - task 127b8e07-fff9-2f91-05d8-000000000086 42613 1727204609.71804: variable 'ansible_search_path' from source: unknown 42613 1727204609.71808: variable 'ansible_search_path' from source: unknown 42613 1727204609.71846: calling self._execute() 42613 1727204609.71984: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204609.72026: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204609.72063: variable 'omit' from source: magic vars 42613 1727204609.72681: variable 'ansible_distribution_major_version' from source: facts 42613 1727204609.72687: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204609.72691: variable 'omit' from source: magic vars 42613 1727204609.72724: variable 'omit' from source: magic vars 42613 1727204609.72813: variable 'omit' from source: magic vars 42613 1727204609.72904: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 42613 1727204609.72958: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 42613 1727204609.72990: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 42613 1727204609.73025: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204609.73047: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204609.73087: variable 'inventory_hostname' from source: host vars for 'managed-node3' 42613 1727204609.73095: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204609.73109: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204609.73478: Set connection var ansible_shell_executable to /bin/sh 42613 1727204609.73496: Set connection var ansible_pipelining to False 42613 1727204609.73517: Set connection var ansible_module_compression to ZIP_DEFLATED 42613 1727204609.73549: Set connection var ansible_connection to ssh 42613 1727204609.73552: Set connection var ansible_timeout to 10 42613 1727204609.73555: Set connection var ansible_shell_type to sh 42613 1727204609.73659: variable 'ansible_shell_executable' from source: unknown 42613 1727204609.73662: variable 'ansible_connection' from source: unknown 42613 1727204609.73765: variable 'ansible_module_compression' from source: unknown 42613 1727204609.73776: variable 'ansible_shell_type' from source: unknown 42613 1727204609.73779: variable 'ansible_shell_executable' from source: unknown 42613 1727204609.73782: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204609.73784: variable 'ansible_pipelining' from source: unknown 42613 1727204609.73787: variable 'ansible_timeout' from source: unknown 42613 1727204609.73789: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204609.73854: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 42613 1727204609.73881: variable 'omit' from source: magic vars 42613 1727204609.73885: starting attempt loop 42613 1727204609.73889: running 
the handler 42613 1727204609.73987: _low_level_execute_command(): starting 42613 1727204609.73990: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 42613 1727204609.74844: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204609.74850: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204609.74883: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204609.74887: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204609.74958: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204609.74963: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204609.75036: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204609.76890: stdout chunk (state=3): >>>/root <<< 42613 1727204609.77058: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204609.77062: stdout chunk (state=3): >>><<< 42613 1727204609.77073: stderr chunk (state=3): >>><<< 42613 1727204609.77096: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204609.77108: _low_level_execute_command(): starting 42613 1727204609.77115: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204609.77096-45113-11247070184625 `" && echo ansible-tmp-1727204609.77096-45113-11247070184625="` echo /root/.ansible/tmp/ansible-tmp-1727204609.77096-45113-11247070184625 `" ) && sleep 0' 42613 1727204609.77662: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204609.77668: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 42613 1727204609.77679: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204609.77682: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204609.77726: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204609.77729: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204609.77819: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204609.80175: stdout chunk (state=3): >>>ansible-tmp-1727204609.77096-45113-11247070184625=/root/.ansible/tmp/ansible-tmp-1727204609.77096-45113-11247070184625 <<< 42613 1727204609.80280: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204609.80430: stderr chunk (state=3): >>><<< 42613 1727204609.80433: stdout chunk (state=3): >>><<< 42613 1727204609.80470: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204609.77096-45113-11247070184625=/root/.ansible/tmp/ansible-tmp-1727204609.77096-45113-11247070184625 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204609.80513: variable 'ansible_module_compression' from source: unknown 42613 1727204609.80555: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-42613igbktgy6/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 42613 1727204609.80608: variable 'ansible_facts' from source: unknown 42613 1727204609.80670: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204609.77096-45113-11247070184625/AnsiballZ_ping.py 42613 1727204609.80794: Sending initial data 42613 1727204609.80797: Sent initial data (150 bytes) 42613 1727204609.81492: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204609.81511: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204609.81531: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204609.81638: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204609.83463: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 42613 1727204609.83563: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 42613 1727204609.83697: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-42613igbktgy6/tmpcfd_l6fx /root/.ansible/tmp/ansible-tmp-1727204609.77096-45113-11247070184625/AnsiballZ_ping.py <<< 42613 1727204609.83701: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204609.77096-45113-11247070184625/AnsiballZ_ping.py" <<< 42613 1727204609.83775: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-42613igbktgy6/tmpcfd_l6fx" to remote "/root/.ansible/tmp/ansible-tmp-1727204609.77096-45113-11247070184625/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204609.77096-45113-11247070184625/AnsiballZ_ping.py" <<< 42613 1727204609.84668: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204609.84782: stderr chunk (state=3): >>><<< 42613 1727204609.84786: stdout chunk (state=3): >>><<< 42613 1727204609.84806: done transferring module to remote 42613 1727204609.84818: _low_level_execute_command(): starting 42613 1727204609.84823: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204609.77096-45113-11247070184625/ /root/.ansible/tmp/ansible-tmp-1727204609.77096-45113-11247070184625/AnsiballZ_ping.py && sleep 0' 42613 1727204609.85381: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204609.85386: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204609.85483: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204609.87633: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204609.87707: stderr chunk (state=3): >>><<< 42613 1727204609.87799: stdout chunk (state=3): >>><<< 42613 1727204609.87847: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204609.87901: _low_level_execute_command(): starting 42613 1727204609.87910: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204609.77096-45113-11247070184625/AnsiballZ_ping.py && sleep 0' 42613 1727204609.88475: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204609.88506: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204609.88509: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204609.88512: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204609.88573: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204609.88576: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204609.88672: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204610.06029: stdout chunk (state=3): >>> {"ping": "pong", "invocation": 
{"module_args": {"data": "pong"}}} <<< 42613 1727204610.07515: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. <<< 42613 1727204610.07581: stderr chunk (state=3): >>><<< 42613 1727204610.07585: stdout chunk (state=3): >>><<< 42613 1727204610.07601: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
42613 1727204610.07624: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204609.77096-45113-11247070184625/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 42613 1727204610.07635: _low_level_execute_command(): starting 42613 1727204610.07641: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204609.77096-45113-11247070184625/ > /dev/null 2>&1 && sleep 0' 42613 1727204610.08240: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204610.08247: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 
1727204610.08259: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204610.08279: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204610.08294: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204610.08382: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204610.10593: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204610.10597: stderr chunk (state=3): >>><<< 42613 1727204610.10600: stdout chunk (state=3): >>><<< 42613 1727204610.10782: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204610.10795: handler run complete 42613 1727204610.10798: attempt loop complete, returning result 
42613 1727204610.10800: _execute() done 42613 1727204610.10802: dumping result to json 42613 1727204610.10804: done dumping result, returning 42613 1727204610.10805: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [127b8e07-fff9-2f91-05d8-000000000086] 42613 1727204610.10807: sending task result for task 127b8e07-fff9-2f91-05d8-000000000086 42613 1727204610.10890: done sending task result for task 127b8e07-fff9-2f91-05d8-000000000086 42613 1727204610.10893: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "ping": "pong" } 42613 1727204610.10978: no more pending results, returning what we have 42613 1727204610.10981: results queue empty 42613 1727204610.10982: checking for any_errors_fatal 42613 1727204610.10990: done checking for any_errors_fatal 42613 1727204610.10991: checking for max_fail_percentage 42613 1727204610.10993: done checking for max_fail_percentage 42613 1727204610.10994: checking to see if all hosts have failed and the running result is not ok 42613 1727204610.10995: done checking to see if all hosts have failed 42613 1727204610.10995: getting the remaining hosts for this loop 42613 1727204610.10997: done getting the remaining hosts for this loop 42613 1727204610.11000: getting the next task for host managed-node3 42613 1727204610.11008: done getting next task for host managed-node3 42613 1727204610.11010: ^ task is: TASK: meta (role_complete) 42613 1727204610.11012: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204610.11023: getting variables 42613 1727204610.11024: in VariableManager get_vars() 42613 1727204610.11064: Calling all_inventory to load vars for managed-node3 42613 1727204610.11108: Calling groups_inventory to load vars for managed-node3 42613 1727204610.11112: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204610.11123: Calling all_plugins_play to load vars for managed-node3 42613 1727204610.11126: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204610.11129: Calling groups_plugins_play to load vars for managed-node3 42613 1727204610.14283: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204610.17742: done with get_vars() 42613 1727204610.17788: done getting variables 42613 1727204610.17943: done queuing things up, now waiting for results queue to drain 42613 1727204610.17946: results queue empty 42613 1727204610.17947: checking for any_errors_fatal 42613 1727204610.17951: done checking for any_errors_fatal 42613 1727204610.17952: checking for max_fail_percentage 42613 1727204610.17953: done checking for max_fail_percentage 42613 1727204610.17954: checking to see if all hosts have failed and the running result is not ok 42613 1727204610.17955: done checking to see if all hosts have failed 42613 1727204610.17956: getting the remaining hosts for this loop 42613 1727204610.17957: done getting the remaining hosts for this loop 42613 1727204610.17960: getting the next task for host managed-node3 42613 1727204610.17964: done getting next task for host managed-node3 42613 1727204610.17968: ^ task is: TASK: meta (flush_handlers) 42613 1727204610.17970: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 42613 1727204610.17973: getting variables 42613 1727204610.17974: in VariableManager get_vars() 42613 1727204610.18068: Calling all_inventory to load vars for managed-node3 42613 1727204610.18071: Calling groups_inventory to load vars for managed-node3 42613 1727204610.18074: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204610.18080: Calling all_plugins_play to load vars for managed-node3 42613 1727204610.18083: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204610.18086: Calling groups_plugins_play to load vars for managed-node3 42613 1727204610.20412: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204610.23748: done with get_vars() 42613 1727204610.23819: done getting variables 42613 1727204610.23883: in VariableManager get_vars() 42613 1727204610.23905: Calling all_inventory to load vars for managed-node3 42613 1727204610.23907: Calling groups_inventory to load vars for managed-node3 42613 1727204610.23910: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204610.23915: Calling all_plugins_play to load vars for managed-node3 42613 1727204610.23918: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204610.23920: Calling groups_plugins_play to load vars for managed-node3 42613 1727204610.27404: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204610.30517: done with get_vars() 42613 1727204610.30571: done queuing things up, now waiting for results queue to drain 42613 1727204610.30574: results queue empty 42613 1727204610.30575: checking for any_errors_fatal 42613 1727204610.30576: done checking for any_errors_fatal 42613 1727204610.30577: checking for max_fail_percentage 42613 1727204610.30578: done checking for max_fail_percentage 42613 1727204610.30579: checking to see if all hosts have failed and 
the running result is not ok 42613 1727204610.30580: done checking to see if all hosts have failed 42613 1727204610.30581: getting the remaining hosts for this loop 42613 1727204610.30582: done getting the remaining hosts for this loop 42613 1727204610.30585: getting the next task for host managed-node3 42613 1727204610.30591: done getting next task for host managed-node3 42613 1727204610.30593: ^ task is: TASK: meta (flush_handlers) 42613 1727204610.30595: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 42613 1727204610.30601: getting variables 42613 1727204610.30602: in VariableManager get_vars() 42613 1727204610.30618: Calling all_inventory to load vars for managed-node3 42613 1727204610.30621: Calling groups_inventory to load vars for managed-node3 42613 1727204610.30623: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204610.30629: Calling all_plugins_play to load vars for managed-node3 42613 1727204610.30632: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204610.30635: Calling groups_plugins_play to load vars for managed-node3 42613 1727204610.32328: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204610.33929: done with get_vars() 42613 1727204610.33958: done getting variables 42613 1727204610.34025: in VariableManager get_vars() 42613 1727204610.34041: Calling all_inventory to load vars for managed-node3 42613 1727204610.34044: Calling groups_inventory to load vars for managed-node3 42613 1727204610.34046: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204610.34052: Calling all_plugins_play to load vars for managed-node3 42613 1727204610.34054: Calling 
groups_plugins_inventory to load vars for managed-node3 42613 1727204610.34057: Calling groups_plugins_play to load vars for managed-node3 42613 1727204610.35834: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204610.39195: done with get_vars() 42613 1727204610.39244: done queuing things up, now waiting for results queue to drain 42613 1727204610.39246: results queue empty 42613 1727204610.39247: checking for any_errors_fatal 42613 1727204610.39248: done checking for any_errors_fatal 42613 1727204610.39249: checking for max_fail_percentage 42613 1727204610.39250: done checking for max_fail_percentage 42613 1727204610.39251: checking to see if all hosts have failed and the running result is not ok 42613 1727204610.39252: done checking to see if all hosts have failed 42613 1727204610.39253: getting the remaining hosts for this loop 42613 1727204610.39254: done getting the remaining hosts for this loop 42613 1727204610.39257: getting the next task for host managed-node3 42613 1727204610.39260: done getting next task for host managed-node3 42613 1727204610.39261: ^ task is: None 42613 1727204610.39263: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204610.39264: done queuing things up, now waiting for results queue to drain 42613 1727204610.39378: results queue empty 42613 1727204610.39381: checking for any_errors_fatal 42613 1727204610.39382: done checking for any_errors_fatal 42613 1727204610.39383: checking for max_fail_percentage 42613 1727204610.39384: done checking for max_fail_percentage 42613 1727204610.39385: checking to see if all hosts have failed and the running result is not ok 42613 1727204610.39386: done checking to see if all hosts have failed 42613 1727204610.39388: getting the next task for host managed-node3 42613 1727204610.39391: done getting next task for host managed-node3 42613 1727204610.39392: ^ task is: None 42613 1727204610.39394: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204610.39725: in VariableManager get_vars() 42613 1727204610.39748: done with get_vars() 42613 1727204610.39756: in VariableManager get_vars() 42613 1727204610.39769: done with get_vars() 42613 1727204610.39774: variable 'omit' from source: magic vars 42613 1727204610.39808: in VariableManager get_vars() 42613 1727204610.39817: done with get_vars() 42613 1727204610.39844: variable 'omit' from source: magic vars PLAY [Delete the interface] **************************************************** 42613 1727204610.40213: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 42613 1727204610.40243: getting the remaining hosts for this loop 42613 1727204610.40245: done getting the remaining hosts for this loop 42613 1727204610.40249: getting the next task for host managed-node3 42613 1727204610.40252: done getting next task for host managed-node3 42613 1727204610.40255: ^ task is: TASK: Gathering Facts 42613 1727204610.40257: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204610.40259: getting variables 42613 1727204610.40278: in VariableManager get_vars() 42613 1727204610.40289: Calling all_inventory to load vars for managed-node3 42613 1727204610.40291: Calling groups_inventory to load vars for managed-node3 42613 1727204610.40294: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204610.40300: Calling all_plugins_play to load vars for managed-node3 42613 1727204610.40302: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204610.40305: Calling groups_plugins_play to load vars for managed-node3 42613 1727204610.42196: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204610.46451: done with get_vars() 42613 1727204610.46550: done getting variables 42613 1727204610.46677: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:5 Tuesday 24 September 2024 15:03:30 -0400 (0:00:00.768) 0:00:39.075 ***** 42613 1727204610.46707: entering _queue_task() for managed-node3/gather_facts 42613 1727204610.47932: worker is 1 (out of 1 available) 42613 1727204610.47948: exiting _queue_task() for managed-node3/gather_facts 42613 1727204610.47958: done queuing things up, now waiting for results queue to drain 42613 1727204610.47960: waiting for pending results... 
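The `_low_level_execute_command()` calls in the trace that follows all run small `/bin/sh -c '… && sleep 0'` one-liners over the multiplexed SSH connection; the first real step is creating a mode-0700 per-task temp directory on the remote host (the `( umask 77 && mkdir -p … )` command visible below). As a minimal sketch of that pattern — the variable names and the directory suffix here are illustrative, not the exact values Ansible computes:

```shell
# Sketch of Ansible's remote tmp-dir step (names/suffix are illustrative).
# umask 77 in a subshell => the new directory is created mode 0700,
# without changing the umask of the surrounding shell.
tmp_root="${HOME}/.ansible/tmp"
tmp_dir="${tmp_root}/ansible-tmp-$(date +%s.%N)-$$-example"
( umask 77 && mkdir -p "${tmp_root}" && mkdir "${tmp_dir}" && echo "${tmp_dir}" )
```

The `echo` at the end mirrors how the real one-liner reports the created path back to the controller on stdout, which is why the log shows the `ansible-tmp-…` path in the `stdout chunk` lines.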
42613 1727204610.49238: running TaskExecutor() for managed-node3/TASK: Gathering Facts 42613 1727204610.49534: in run() - task 127b8e07-fff9-2f91-05d8-00000000057e 42613 1727204610.49539: variable 'ansible_search_path' from source: unknown 42613 1727204610.49543: calling self._execute() 42613 1727204610.49859: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204610.50012: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204610.50088: variable 'omit' from source: magic vars 42613 1727204610.51437: variable 'ansible_distribution_major_version' from source: facts 42613 1727204610.51776: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204610.51884: variable 'omit' from source: magic vars 42613 1727204610.51888: variable 'omit' from source: magic vars 42613 1727204610.51956: variable 'omit' from source: magic vars 42613 1727204610.52136: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 42613 1727204610.52345: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 42613 1727204610.52378: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 42613 1727204610.52403: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204610.52529: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204610.52974: variable 'inventory_hostname' from source: host vars for 'managed-node3' 42613 1727204610.52979: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204610.52982: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204610.53073: Set connection var ansible_shell_executable to /bin/sh 42613 1727204610.53138: Set 
connection var ansible_pipelining to False 42613 1727204610.53202: Set connection var ansible_module_compression to ZIP_DEFLATED 42613 1727204610.53253: Set connection var ansible_connection to ssh 42613 1727204610.53311: Set connection var ansible_timeout to 10 42613 1727204610.53321: Set connection var ansible_shell_type to sh 42613 1727204610.53356: variable 'ansible_shell_executable' from source: unknown 42613 1727204610.53630: variable 'ansible_connection' from source: unknown 42613 1727204610.53634: variable 'ansible_module_compression' from source: unknown 42613 1727204610.53637: variable 'ansible_shell_type' from source: unknown 42613 1727204610.53640: variable 'ansible_shell_executable' from source: unknown 42613 1727204610.53642: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204610.53644: variable 'ansible_pipelining' from source: unknown 42613 1727204610.53647: variable 'ansible_timeout' from source: unknown 42613 1727204610.53649: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204610.54286: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 42613 1727204610.54290: variable 'omit' from source: magic vars 42613 1727204610.54294: starting attempt loop 42613 1727204610.54297: running the handler 42613 1727204610.54299: variable 'ansible_facts' from source: unknown 42613 1727204610.54302: _low_level_execute_command(): starting 42613 1727204610.54304: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 42613 1727204610.55738: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204610.55890: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204610.55997: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204610.57848: stdout chunk (state=3): >>>/root <<< 42613 1727204610.58075: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204610.58079: stdout chunk (state=3): >>><<< 42613 1727204610.58082: stderr chunk (state=3): >>><<< 42613 1727204610.58113: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204610.58232: _low_level_execute_command(): starting 42613 1727204610.58240: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204610.5812085-45150-159155540864570 `" && echo ansible-tmp-1727204610.5812085-45150-159155540864570="` echo /root/.ansible/tmp/ansible-tmp-1727204610.5812085-45150-159155540864570 `" ) && sleep 0' 42613 1727204610.58856: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204610.58874: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204610.58911: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204610.58929: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 42613 1727204610.58988: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204610.59062: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204610.59093: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204610.59143: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204610.59352: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204610.61494: stdout chunk (state=3): >>>ansible-tmp-1727204610.5812085-45150-159155540864570=/root/.ansible/tmp/ansible-tmp-1727204610.5812085-45150-159155540864570 <<< 42613 1727204610.61716: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204610.61720: stdout chunk (state=3): >>><<< 42613 1727204610.61722: stderr chunk (state=3): >>><<< 42613 1727204610.61873: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204610.5812085-45150-159155540864570=/root/.ansible/tmp/ansible-tmp-1727204610.5812085-45150-159155540864570 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204610.61877: variable 'ansible_module_compression' from source: unknown 42613 1727204610.61880: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-42613igbktgy6/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 42613 1727204610.61930: variable 'ansible_facts' from source: unknown 42613 1727204610.62145: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204610.5812085-45150-159155540864570/AnsiballZ_setup.py 42613 1727204610.62345: Sending initial data 42613 1727204610.62356: Sent initial data (154 bytes) 42613 1727204610.63041: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204610.63081: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found <<< 42613 1727204610.63185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204610.63208: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204610.63435: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204610.63580: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204610.65373: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 42613 1727204610.65441: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 42613 1727204610.65515: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-42613igbktgy6/tmpwgnoyqq3 /root/.ansible/tmp/ansible-tmp-1727204610.5812085-45150-159155540864570/AnsiballZ_setup.py <<< 42613 1727204610.65529: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204610.5812085-45150-159155540864570/AnsiballZ_setup.py" <<< 42613 1727204610.65615: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-42613igbktgy6/tmpwgnoyqq3" to remote "/root/.ansible/tmp/ansible-tmp-1727204610.5812085-45150-159155540864570/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204610.5812085-45150-159155540864570/AnsiballZ_setup.py" <<< 42613 1727204610.68019: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204610.68202: stderr chunk (state=3): >>><<< 42613 1727204610.68206: stdout chunk (state=3): >>><<< 42613 1727204610.68208: done transferring module to remote 42613 1727204610.68211: _low_level_execute_command(): starting 42613 1727204610.68213: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204610.5812085-45150-159155540864570/ /root/.ansible/tmp/ansible-tmp-1727204610.5812085-45150-159155540864570/AnsiballZ_setup.py && sleep 0' 42613 1727204610.69262: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204610.69333: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204610.69409: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204610.69454: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204610.69497: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204610.69634: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204610.71753: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204610.71800: stderr chunk (state=3): >>><<< 42613 1727204610.71811: stdout chunk (state=3): >>><<< 42613 1727204610.71846: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204610.71863: _low_level_execute_command(): starting 42613 1727204610.71881: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204610.5812085-45150-159155540864570/AnsiballZ_setup.py && sleep 0' 42613 1727204610.72664: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204610.72682: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204610.72874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204610.72970: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204610.73218: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204610.73305: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204611.44478: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_lsb": {}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fips": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 39754 10.31.45.169 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 39754 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQDMno02CCUpdfdqhSLw3DEMgRB2qjjltGvjvvCA4FaqYC6LCLimL/modyFwZBTr6mod5aVbEv6eVT3cFEFELkt0kQvtOziptiFW5YZ0dlqvF004nHv7tpUqUboaKXf3hY9kfDIHOuUKZOV1AH7UTuNGixxTuYXFQ+fG7hLGh4Vep864Qk6wN5hv56JDtXEzMMB7xxbnEU6nTFIA8TIX+aYYYxIipVJjI+TR9J9VhQf+oNDJhhqqHyobnqG5WTt3jEYQo+8cWC4B8LegOCwae4jpCrLwhKnzmvV787NTqy90vgHgain4GhWTCKI+2dFsqryKBgKIBuENXOpmpRGT4gqBQHbc5v/vxWqYoPPhg1Wb8R+WRueYbdol4I10CveFNShlWCSRLSu/vOutS0xtU3WEIQFs2Mn06Aqq6bMoG70EJ9bJOEQ82f23VIkVKoW1hmcKHTCrMv715oNONo08LOTQkBYDv3MQpAtFQnpuIVPlAXYu1spAx3i2i31866ukCUE=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBG5WDcSyqLv17rg+6P3+2pjKf2x2X+Jf8yHGACagVeIm/l8LWG2NszXTHOdaZlbD4aes7hBRe0B7oCa8ilqHGf0=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAII5o3yNikV31ncy7je2EsNwog36vbYT7D9w98r4ZeD7x", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_is_chroot": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "03", "second": "31", "epoch": "1727204611", "epoch_int": "1727204611", "date": "2024-09-24", "time": "15:03:31", "iso8601_micro": "2024-09-24T19:03:31.043884Z", "iso8601": "2024-09-24T19:03:31Z", "iso8601_basic": "20240924T150331043884", "iso8601_basic_short": "20240924T150331", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_local": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, 
"ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2f45753f4562ec5cb5af974e0ba4b7", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_iscsi_iqn": "", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_apparmor": {"status": "disabled"}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_fibre_channel_wwn": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3035, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 681, "free": 3035}, "nocache": {"free": 3480, "used": 236}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": 
"08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2f4575-3f45-62ec-5cb5-af974e0ba4b7", "ansible_product_uuid": "ec2f4575-3f45-62ec-5cb5-af974e0ba4b7", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 948, 
"ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251303608320, "block_size": 4096, "block_total": 64479564, "block_available": 61353420, "block_used": 3126144, "inode_total": 16384000, "inode_available": 16301441, "inode_used": 82559, "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391"}], "ansible_loadavg": {"1m": 0.52783203125, "5m": 0.61328125, "15m": 0.4208984375}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "ansible_pkg_mgr": "dnf", "ansible_interfaces": ["lo", "eth0", "peerethtest0", "ethtest0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", 
"tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "02:aa:78:a8:9b:13", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.169", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::aa:78ff:fea8:9b13", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", 
"tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], 
"hw_timestamp_filters": []}, "ansible_ethtest0": {"device": "ethtest0", "macaddress": "f6:13:d9:76:0f:3f", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv4": {"address": "192.168.122.250", "broadcast": "192.168.122.255", "netmask": "255.255.255.0", "network": "192.168.122.0", "prefix": "24"}, "ipv6": [{"address": "2001:db8::1:1", "prefix": "64", "scope": "global"}, {"address": "fe80::f413:d9ff:fe76:f3f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", 
"rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerethtest0": {"device": "peerethtest0", "macaddress": "ce:18:00:21:fb:93", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::cc18:ff:fe21:fb93", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", 
"tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.169", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:aa:78:a8:9b:13", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.169", "192.168.122.250"], "ansible_all_ipv6_addresses": ["fe80::aa:78ff:fea8:9b13", "2001:db8::1:1", "fe80::f413:d9ff:fe76:f3f", "fe80::cc18:ff:fe21:fb93"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.169", "127.0.0.0/8", "127.0.0.1", "192.168.122.250"], "ipv6": ["::1", "2001:db8::1:1", "fe80::aa:78ff:fea8:9b13", "fe80::cc18:ff:fe21:fb93", "fe80::f413:d9ff:fe76:f3f"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": 
"/etc/ansible/facts.d"}}} <<< 42613 1727204611.46882: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. <<< 42613 1727204611.47173: stdout chunk (state=3): >>><<< 42613 1727204611.47178: stderr chunk (state=3): >>><<< 42613 1727204611.47182: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_lsb": {}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fips": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 39754 10.31.45.169 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 39754 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQDMno02CCUpdfdqhSLw3DEMgRB2qjjltGvjvvCA4FaqYC6LCLimL/modyFwZBTr6mod5aVbEv6eVT3cFEFELkt0kQvtOziptiFW5YZ0dlqvF004nHv7tpUqUboaKXf3hY9kfDIHOuUKZOV1AH7UTuNGixxTuYXFQ+fG7hLGh4Vep864Qk6wN5hv56JDtXEzMMB7xxbnEU6nTFIA8TIX+aYYYxIipVJjI+TR9J9VhQf+oNDJhhqqHyobnqG5WTt3jEYQo+8cWC4B8LegOCwae4jpCrLwhKnzmvV787NTqy90vgHgain4GhWTCKI+2dFsqryKBgKIBuENXOpmpRGT4gqBQHbc5v/vxWqYoPPhg1Wb8R+WRueYbdol4I10CveFNShlWCSRLSu/vOutS0xtU3WEIQFs2Mn06Aqq6bMoG70EJ9bJOEQ82f23VIkVKoW1hmcKHTCrMv715oNONo08LOTQkBYDv3MQpAtFQnpuIVPlAXYu1spAx3i2i31866ukCUE=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBG5WDcSyqLv17rg+6P3+2pjKf2x2X+Jf8yHGACagVeIm/l8LWG2NszXTHOdaZlbD4aes7hBRe0B7oCa8ilqHGf0=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAII5o3yNikV31ncy7je2EsNwog36vbYT7D9w98r4ZeD7x", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_is_chroot": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "03", "second": "31", "epoch": "1727204611", "epoch_int": "1727204611", "date": "2024-09-24", "time": "15:03:31", "iso8601_micro": "2024-09-24T19:03:31.043884Z", "iso8601": "2024-09-24T19:03:31Z", "iso8601_basic": "20240924T150331043884", "iso8601_basic_short": "20240924T150331", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_local": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, 
"ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2f45753f4562ec5cb5af974e0ba4b7", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_iscsi_iqn": "", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_apparmor": {"status": "disabled"}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_fibre_channel_wwn": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3035, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 681, "free": 3035}, "nocache": {"free": 3480, "used": 236}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": 
"08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2f4575-3f45-62ec-5cb5-af974e0ba4b7", "ansible_product_uuid": "ec2f4575-3f45-62ec-5cb5-af974e0ba4b7", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 948, 
"ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251303608320, "block_size": 4096, "block_total": 64479564, "block_available": 61353420, "block_used": 3126144, "inode_total": 16384000, "inode_available": 16301441, "inode_used": 82559, "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391"}], "ansible_loadavg": {"1m": 0.52783203125, "5m": 0.61328125, "15m": 0.4208984375}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "ansible_pkg_mgr": "dnf", "ansible_interfaces": ["lo", "eth0", "peerethtest0", "ethtest0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", 
"tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "02:aa:78:a8:9b:13", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.169", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::aa:78ff:fea8:9b13", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", 
"tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], 
"hw_timestamp_filters": []}, "ansible_ethtest0": {"device": "ethtest0", "macaddress": "f6:13:d9:76:0f:3f", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv4": {"address": "192.168.122.250", "broadcast": "192.168.122.255", "netmask": "255.255.255.0", "network": "192.168.122.0", "prefix": "24"}, "ipv6": [{"address": "2001:db8::1:1", "prefix": "64", "scope": "global"}, {"address": "fe80::f413:d9ff:fe76:f3f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", 
"rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerethtest0": {"device": "peerethtest0", "macaddress": "ce:18:00:21:fb:93", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::cc18:ff:fe21:fb93", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", 
"tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.169", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:aa:78:a8:9b:13", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.169", "192.168.122.250"], "ansible_all_ipv6_addresses": ["fe80::aa:78ff:fea8:9b13", "2001:db8::1:1", "fe80::f413:d9ff:fe76:f3f", "fe80::cc18:ff:fe21:fb93"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.169", "127.0.0.0/8", "127.0.0.1", "192.168.122.250"], "ipv6": ["::1", "2001:db8::1:1", "fe80::aa:78ff:fea8:9b13", "fe80::cc18:ff:fe21:fb93", "fe80::f413:d9ff:fe76:f3f"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": 
"/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
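The `ansible.legacy.setup` result above is plain JSON, so the interface facts can be inspected offline. A minimal sketch, assuming the module output has been captured into a dict named `facts` (the abbreviated structure below mirrors the `ansible_ethtest0` / `ansible_default_ipv4` excerpts in the log; none of this is part of Ansible itself):

```python
# Abbreviated copy of the fact structure shown in the log above.
facts = {
    "ansible_ethtest0": {
        "device": "ethtest0",
        "mtu": 1500,
        "active": True,
        "ipv4": {"address": "192.168.122.250", "prefix": "24"},
    },
    # ansible_default_ipv4 describes the default route, not a NIC fact,
    # so it carries "interface" rather than "device".
    "ansible_default_ipv4": {"interface": "eth0", "address": "10.31.45.169"},
}

# Collect every per-NIC ansible_<nic> entry that carries an IPv4 address.
addrs = {
    key.removeprefix("ansible_"): val["ipv4"]["address"]
    for key, val in facts.items()
    if isinstance(val, dict) and "device" in val and "ipv4" in val
}
print(addrs)  # {'ethtest0': '192.168.122.250'}
```

The `"device"` key is what distinguishes per-interface fact entries from summary entries such as `ansible_default_ipv4` in the payload above.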
42613 1727204611.48106: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204610.5812085-45150-159155540864570/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 42613 1727204611.48260: _low_level_execute_command(): starting 42613 1727204611.48346: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204610.5812085-45150-159155540864570/ > /dev/null 2>&1 && sleep 0' 42613 1727204611.49944: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204611.50189: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204611.50291: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204611.50305: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204611.50674: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204611.52762: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204611.52934: stderr chunk (state=3): >>><<< 42613 1727204611.52971: stdout chunk (state=3): >>><<< 42613 1727204611.52996: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204611.53172: handler run complete 42613 1727204611.53507: variable 'ansible_facts' from source: 
unknown 42613 1727204611.53872: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204611.55075: variable 'ansible_facts' from source: unknown 42613 1727204611.55372: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204611.55833: attempt loop complete, returning result 42613 1727204611.55855: _execute() done 42613 1727204611.55960: dumping result to json 42613 1727204611.56013: done dumping result, returning 42613 1727204611.56030: done running TaskExecutor() for managed-node3/TASK: Gathering Facts [127b8e07-fff9-2f91-05d8-00000000057e] 42613 1727204611.56043: sending task result for task 127b8e07-fff9-2f91-05d8-00000000057e ok: [managed-node3] 42613 1727204611.58309: done sending task result for task 127b8e07-fff9-2f91-05d8-00000000057e 42613 1727204611.58313: WORKER PROCESS EXITING 42613 1727204611.58482: no more pending results, returning what we have 42613 1727204611.58486: results queue empty 42613 1727204611.58487: checking for any_errors_fatal 42613 1727204611.58489: done checking for any_errors_fatal 42613 1727204611.58490: checking for max_fail_percentage 42613 1727204611.58492: done checking for max_fail_percentage 42613 1727204611.58493: checking to see if all hosts have failed and the running result is not ok 42613 1727204611.58494: done checking to see if all hosts have failed 42613 1727204611.58495: getting the remaining hosts for this loop 42613 1727204611.58497: done getting the remaining hosts for this loop 42613 1727204611.58502: getting the next task for host managed-node3 42613 1727204611.58507: done getting next task for host managed-node3 42613 1727204611.58510: ^ task is: TASK: meta (flush_handlers) 42613 1727204611.58512: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 42613 1727204611.58517: getting variables 42613 1727204611.58519: in VariableManager get_vars() 42613 1727204611.58544: Calling all_inventory to load vars for managed-node3 42613 1727204611.58547: Calling groups_inventory to load vars for managed-node3 42613 1727204611.58550: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204611.58771: Calling all_plugins_play to load vars for managed-node3 42613 1727204611.58778: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204611.58783: Calling groups_plugins_play to load vars for managed-node3 42613 1727204611.64308: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204611.69526: done with get_vars() 42613 1727204611.69681: done getting variables 42613 1727204611.69763: in VariableManager get_vars() 42613 1727204611.69835: Calling all_inventory to load vars for managed-node3 42613 1727204611.69838: Calling groups_inventory to load vars for managed-node3 42613 1727204611.69841: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204611.69847: Calling all_plugins_play to load vars for managed-node3 42613 1727204611.69850: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204611.69853: Calling groups_plugins_play to load vars for managed-node3 42613 1727204611.74134: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204611.78222: done with get_vars() 42613 1727204611.78448: done queuing things up, now waiting for results queue to drain 42613 1727204611.78450: results queue empty 42613 1727204611.78451: checking for any_errors_fatal 42613 1727204611.78458: done checking for any_errors_fatal 42613 1727204611.78459: checking for max_fail_percentage 42613 1727204611.78460: done checking for 
max_fail_percentage 42613 1727204611.78461: checking to see if all hosts have failed and the running result is not ok 42613 1727204611.78469: done checking to see if all hosts have failed 42613 1727204611.78470: getting the remaining hosts for this loop 42613 1727204611.78471: done getting the remaining hosts for this loop 42613 1727204611.78475: getting the next task for host managed-node3 42613 1727204611.78480: done getting next task for host managed-node3 42613 1727204611.78483: ^ task is: TASK: Include the task 'delete_interface.yml' 42613 1727204611.78485: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 42613 1727204611.78487: getting variables 42613 1727204611.78488: in VariableManager get_vars() 42613 1727204611.78500: Calling all_inventory to load vars for managed-node3 42613 1727204611.78502: Calling groups_inventory to load vars for managed-node3 42613 1727204611.78505: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204611.78512: Calling all_plugins_play to load vars for managed-node3 42613 1727204611.78514: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204611.78517: Calling groups_plugins_play to load vars for managed-node3 42613 1727204611.80370: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204611.82870: done with get_vars() 42613 1727204611.82904: done getting variables TASK [Include the task 'delete_interface.yml'] ********************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:8 Tuesday 24 September 2024 15:03:31 -0400 (0:00:01.364) 0:00:40.440 ***** 42613 1727204611.83203: 
entering _queue_task() for managed-node3/include_tasks 42613 1727204611.83674: worker is 1 (out of 1 available) 42613 1727204611.83688: exiting _queue_task() for managed-node3/include_tasks 42613 1727204611.83702: done queuing things up, now waiting for results queue to drain 42613 1727204611.83704: waiting for pending results... 42613 1727204611.84056: running TaskExecutor() for managed-node3/TASK: Include the task 'delete_interface.yml' 42613 1727204611.84194: in run() - task 127b8e07-fff9-2f91-05d8-000000000089 42613 1727204611.84224: variable 'ansible_search_path' from source: unknown 42613 1727204611.84275: calling self._execute() 42613 1727204611.84388: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204611.84407: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204611.84425: variable 'omit' from source: magic vars 42613 1727204611.84882: variable 'ansible_distribution_major_version' from source: facts 42613 1727204611.84903: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204611.84915: _execute() done 42613 1727204611.84943: dumping result to json 42613 1727204611.84954: done dumping result, returning 42613 1727204611.84968: done running TaskExecutor() for managed-node3/TASK: Include the task 'delete_interface.yml' [127b8e07-fff9-2f91-05d8-000000000089] 42613 1727204611.85002: sending task result for task 127b8e07-fff9-2f91-05d8-000000000089 42613 1727204611.85247: no more pending results, returning what we have 42613 1727204611.85253: in VariableManager get_vars() 42613 1727204611.85299: Calling all_inventory to load vars for managed-node3 42613 1727204611.85303: Calling groups_inventory to load vars for managed-node3 42613 1727204611.85307: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204611.85325: Calling all_plugins_play to load vars for managed-node3 42613 1727204611.85329: Calling groups_plugins_inventory to load vars for 
managed-node3 42613 1727204611.85333: Calling groups_plugins_play to load vars for managed-node3 42613 1727204611.86102: done sending task result for task 127b8e07-fff9-2f91-05d8-000000000089 42613 1727204611.86106: WORKER PROCESS EXITING 42613 1727204611.88333: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204611.90900: done with get_vars() 42613 1727204611.90934: variable 'ansible_search_path' from source: unknown 42613 1727204611.91189: we have included files to process 42613 1727204611.91191: generating all_blocks data 42613 1727204611.91192: done generating all_blocks data 42613 1727204611.91193: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 42613 1727204611.91194: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 42613 1727204611.91197: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 42613 1727204611.91596: done processing included file 42613 1727204611.91599: iterating over new_blocks loaded from include file 42613 1727204611.91600: in VariableManager get_vars() 42613 1727204611.91616: done with get_vars() 42613 1727204611.91618: filtering new block on tags 42613 1727204611.91637: done filtering new block on tags 42613 1727204611.91640: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml for managed-node3 42613 1727204611.91656: extending task lists for all hosts with included blocks 42613 1727204611.91697: done extending task lists 42613 1727204611.91698: done processing included files 42613 1727204611.91699: results queue empty 42613 1727204611.91700: checking for any_errors_fatal 
42613 1727204611.91702: done checking for any_errors_fatal 42613 1727204611.91702: checking for max_fail_percentage 42613 1727204611.91704: done checking for max_fail_percentage 42613 1727204611.91705: checking to see if all hosts have failed and the running result is not ok 42613 1727204611.91705: done checking to see if all hosts have failed 42613 1727204611.91706: getting the remaining hosts for this loop 42613 1727204611.91708: done getting the remaining hosts for this loop 42613 1727204611.91711: getting the next task for host managed-node3 42613 1727204611.91714: done getting next task for host managed-node3 42613 1727204611.91717: ^ task is: TASK: Remove test interface if necessary 42613 1727204611.91720: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204611.91723: getting variables 42613 1727204611.91724: in VariableManager get_vars() 42613 1727204611.91734: Calling all_inventory to load vars for managed-node3 42613 1727204611.91737: Calling groups_inventory to load vars for managed-node3 42613 1727204611.91739: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204611.91745: Calling all_plugins_play to load vars for managed-node3 42613 1727204611.91747: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204611.91750: Calling groups_plugins_play to load vars for managed-node3 42613 1727204611.93987: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204611.96972: done with get_vars() 42613 1727204611.97012: done getting variables 42613 1727204611.97070: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Remove test interface if necessary] ************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml:3 Tuesday 24 September 2024 15:03:31 -0400 (0:00:00.139) 0:00:40.579 ***** 42613 1727204611.97107: entering _queue_task() for managed-node3/command 42613 1727204611.97522: worker is 1 (out of 1 available) 42613 1727204611.97537: exiting _queue_task() for managed-node3/command 42613 1727204611.97553: done queuing things up, now waiting for results queue to drain 42613 1727204611.97555: waiting for pending results... 
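The queued command task then goes through the same `_low_level_execute_command()` sequence visible in the surrounding log: first a `/bin/sh -c 'echo ~ && sleep 0'` probe to resolve the remote home directory, then a umask-protected `mkdir` of a uniquely named temp directory for the module payload. A minimal local sketch of that two-step bootstrap (the `/tmp/demo-tmp` path and the exact name format are assumptions inferred from the `ansible-tmp-<time>-<n>-<n>` strings in the log, not Ansible's actual implementation):

```python
import os
import random
import subprocess
import time

# Step 1: resolve the home directory exactly as the log's first
# _low_level_execute_command() does.
home = subprocess.run(
    ["/bin/sh", "-c", "echo ~ && sleep 0"],
    capture_output=True, text=True, check=True,
).stdout.strip()

# Step 2: create a uniquely named work dir under umask 77, mirroring the
# '( umask 77 && mkdir -p ... && mkdir ... )' command later in the log.
# The name components (time, pid, random suffix) are an assumption.
name = f"ansible-tmp-{time.time()}-{os.getpid()}-{random.randrange(10**14)}"
cmd = f'( umask 77 && mkdir -p /tmp/demo-tmp && mkdir "/tmp/demo-tmp/{name}" ) && sleep 0'
subprocess.run(["/bin/sh", "-c", cmd], check=True)
```

Because of `umask 77`, the new directory ends up mode 0700, which is why the log's subsequent `chmod u+x` on the AnsiballZ payload suffices: only the connecting user can traverse the tree.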
42613 1727204611.97992: running TaskExecutor() for managed-node3/TASK: Remove test interface if necessary 42613 1727204611.98029: in run() - task 127b8e07-fff9-2f91-05d8-00000000058f 42613 1727204611.98058: variable 'ansible_search_path' from source: unknown 42613 1727204611.98068: variable 'ansible_search_path' from source: unknown 42613 1727204611.98119: calling self._execute() 42613 1727204611.98235: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204611.98251: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204611.98270: variable 'omit' from source: magic vars 42613 1727204611.98720: variable 'ansible_distribution_major_version' from source: facts 42613 1727204611.98744: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204611.98755: variable 'omit' from source: magic vars 42613 1727204611.98803: variable 'omit' from source: magic vars 42613 1727204611.98917: variable 'interface' from source: set_fact 42613 1727204611.98942: variable 'omit' from source: magic vars 42613 1727204611.98998: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 42613 1727204611.99048: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 42613 1727204611.99081: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 42613 1727204611.99107: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204611.99125: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204611.99168: variable 'inventory_hostname' from source: host vars for 'managed-node3' 42613 1727204611.99180: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204611.99187: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204611.99318: Set connection var ansible_shell_executable to /bin/sh 42613 1727204611.99392: Set connection var ansible_pipelining to False 42613 1727204611.99396: Set connection var ansible_module_compression to ZIP_DEFLATED 42613 1727204611.99398: Set connection var ansible_connection to ssh 42613 1727204611.99400: Set connection var ansible_timeout to 10 42613 1727204611.99402: Set connection var ansible_shell_type to sh 42613 1727204611.99407: variable 'ansible_shell_executable' from source: unknown 42613 1727204611.99410: variable 'ansible_connection' from source: unknown 42613 1727204611.99412: variable 'ansible_module_compression' from source: unknown 42613 1727204611.99414: variable 'ansible_shell_type' from source: unknown 42613 1727204611.99416: variable 'ansible_shell_executable' from source: unknown 42613 1727204611.99425: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204611.99434: variable 'ansible_pipelining' from source: unknown 42613 1727204611.99443: variable 'ansible_timeout' from source: unknown 42613 1727204611.99453: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204611.99624: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 42613 1727204611.99644: variable 'omit' from source: magic vars 42613 1727204611.99719: starting attempt loop 42613 1727204611.99722: running the handler 42613 1727204611.99724: _low_level_execute_command(): starting 42613 1727204611.99726: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 42613 1727204612.00511: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 
1727204612.00550: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204612.00611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 42613 1727204612.00625: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 42613 1727204612.00722: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204612.00736: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204612.00757: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204612.00862: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204612.02720: stdout chunk (state=3): >>>/root <<< 42613 1727204612.02945: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204612.02950: stdout chunk (state=3): >>><<< 42613 1727204612.02952: stderr chunk (state=3): >>><<< 42613 1727204612.03100: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204612.03104: _low_level_execute_command(): starting 42613 1727204612.03107: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204612.0299077-45221-90296891281218 `" && echo ansible-tmp-1727204612.0299077-45221-90296891281218="` echo /root/.ansible/tmp/ansible-tmp-1727204612.0299077-45221-90296891281218 `" ) && sleep 0' 42613 1727204612.03768: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204612.03782: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204612.03790: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204612.03806: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 42613 1727204612.03818: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 
originally 10.31.45.169 <<< 42613 1727204612.03875: stderr chunk (state=3): >>>debug2: match not found <<< 42613 1727204612.03879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204612.03881: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 42613 1727204612.03888: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.169 is address <<< 42613 1727204612.03891: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 42613 1727204612.03893: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204612.03930: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204612.03975: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204612.03981: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204612.04001: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204612.04114: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204612.06305: stdout chunk (state=3): >>>ansible-tmp-1727204612.0299077-45221-90296891281218=/root/.ansible/tmp/ansible-tmp-1727204612.0299077-45221-90296891281218 <<< 42613 1727204612.06496: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204612.06538: stderr chunk (state=3): >>><<< 42613 1727204612.06556: stdout chunk (state=3): >>><<< 42613 1727204612.06582: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204612.0299077-45221-90296891281218=/root/.ansible/tmp/ansible-tmp-1727204612.0299077-45221-90296891281218 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204612.06629: variable 'ansible_module_compression' from source: unknown 42613 1727204612.06685: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-42613igbktgy6/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 42613 1727204612.06738: variable 'ansible_facts' from source: unknown 42613 1727204612.06926: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204612.0299077-45221-90296891281218/AnsiballZ_command.py 42613 1727204612.07075: Sending initial data 42613 1727204612.07079: Sent initial data (155 bytes) 42613 1727204612.07971: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 42613 1727204612.08132: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204612.08276: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204612.08305: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204612.08422: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204612.10243: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension 
"home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 42613 1727204612.10322: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 42613 1727204612.10399: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-42613igbktgy6/tmpamg9q89n /root/.ansible/tmp/ansible-tmp-1727204612.0299077-45221-90296891281218/AnsiballZ_command.py <<< 42613 1727204612.10402: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204612.0299077-45221-90296891281218/AnsiballZ_command.py" <<< 42613 1727204612.10469: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-42613igbktgy6/tmpamg9q89n" to remote "/root/.ansible/tmp/ansible-tmp-1727204612.0299077-45221-90296891281218/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204612.0299077-45221-90296891281218/AnsiballZ_command.py" <<< 42613 1727204612.11499: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204612.11512: stdout chunk (state=3): >>><<< 42613 1727204612.11566: stderr chunk (state=3): >>><<< 42613 1727204612.11570: done transferring module to remote 42613 1727204612.11591: _low_level_execute_command(): starting 42613 1727204612.11605: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204612.0299077-45221-90296891281218/ /root/.ansible/tmp/ansible-tmp-1727204612.0299077-45221-90296891281218/AnsiballZ_command.py && sleep 0' 42613 1727204612.12728: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204612.12780: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204612.12808: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204612.12830: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204612.12950: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204612.15130: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204612.15135: stdout chunk (state=3): >>><<< 42613 1727204612.15137: stderr chunk (state=3): >>><<< 42613 1727204612.15258: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204612.15262: _low_level_execute_command(): starting 42613 1727204612.15268: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204612.0299077-45221-90296891281218/AnsiballZ_command.py && sleep 0' 42613 1727204612.15896: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204612.15918: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204612.16035: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204612.16063: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 42613 1727204612.16481: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204612.34985: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "ethtest0"], "start": "2024-09-24 15:03:32.337786", "end": "2024-09-24 15:03:32.345295", "delta": "0:00:00.007509", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del ethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 42613 1727204612.38351: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. <<< 42613 1727204612.38415: stderr chunk (state=3): >>><<< 42613 1727204612.38419: stdout chunk (state=3): >>><<< 42613 1727204612.38437: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "ethtest0"], "start": "2024-09-24 15:03:32.337786", "end": "2024-09-24 15:03:32.345295", "delta": "0:00:00.007509", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del ethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 42613 1727204612.38480: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204612.0299077-45221-90296891281218/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 42613 1727204612.38496: _low_level_execute_command(): starting 42613 1727204612.38500: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204612.0299077-45221-90296891281218/ > /dev/null 2>&1 && sleep 0' 42613 1727204612.39013: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 42613 1727204612.39017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204612.39020: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204612.39024: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204612.39080: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204612.39084: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204612.39095: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204612.39190: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204612.41807: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204612.41871: stderr chunk (state=3): >>><<< 42613 1727204612.41875: stdout chunk (state=3): >>><<< 42613 1727204612.41891: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204612.41897: handler run complete 42613 1727204612.41916: Evaluated conditional (False): False 42613 1727204612.41924: attempt loop complete, returning result 42613 1727204612.41927: _execute() done 42613 1727204612.41930: dumping result to json 42613 1727204612.41935: done dumping result, returning 42613 1727204612.41945: done running TaskExecutor() for managed-node3/TASK: Remove test interface if necessary [127b8e07-fff9-2f91-05d8-00000000058f] 42613 1727204612.41952: sending task result for task 127b8e07-fff9-2f91-05d8-00000000058f 42613 1727204612.42062: done sending task result for task 127b8e07-fff9-2f91-05d8-00000000058f 42613 1727204612.42067: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": [ "ip", "link", "del", "ethtest0" ], "delta": "0:00:00.007509", "end": "2024-09-24 15:03:32.345295", "rc": 0, "start": "2024-09-24 15:03:32.337786" } 42613 1727204612.42134: no more pending results, returning what we have 42613 1727204612.42138: results queue empty 42613 1727204612.42141: checking for any_errors_fatal 42613 1727204612.42143: done checking for any_errors_fatal 42613 1727204612.42143: 
checking for max_fail_percentage 42613 1727204612.42147: done checking for max_fail_percentage 42613 1727204612.42148: checking to see if all hosts have failed and the running result is not ok 42613 1727204612.42149: done checking to see if all hosts have failed 42613 1727204612.42150: getting the remaining hosts for this loop 42613 1727204612.42151: done getting the remaining hosts for this loop 42613 1727204612.42155: getting the next task for host managed-node3 42613 1727204612.42173: done getting next task for host managed-node3 42613 1727204612.42177: ^ task is: TASK: meta (flush_handlers) 42613 1727204612.42179: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 42613 1727204612.42184: getting variables 42613 1727204612.42185: in VariableManager get_vars() 42613 1727204612.42216: Calling all_inventory to load vars for managed-node3 42613 1727204612.42218: Calling groups_inventory to load vars for managed-node3 42613 1727204612.42222: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204612.42234: Calling all_plugins_play to load vars for managed-node3 42613 1727204612.42237: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204612.42242: Calling groups_plugins_play to load vars for managed-node3 42613 1727204612.43321: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204612.44581: done with get_vars() 42613 1727204612.44610: done getting variables 42613 1727204612.44673: in VariableManager get_vars() 42613 1727204612.44683: Calling all_inventory to load vars for managed-node3 42613 1727204612.44685: Calling groups_inventory to load vars for managed-node3 42613 1727204612.44687: Calling all_plugins_inventory 
to load vars for managed-node3 42613 1727204612.44691: Calling all_plugins_play to load vars for managed-node3 42613 1727204612.44693: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204612.44695: Calling groups_plugins_play to load vars for managed-node3 42613 1727204612.45661: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204612.46899: done with get_vars() 42613 1727204612.46932: done queuing things up, now waiting for results queue to drain 42613 1727204612.46935: results queue empty 42613 1727204612.46935: checking for any_errors_fatal 42613 1727204612.46940: done checking for any_errors_fatal 42613 1727204612.46941: checking for max_fail_percentage 42613 1727204612.46942: done checking for max_fail_percentage 42613 1727204612.46943: checking to see if all hosts have failed and the running result is not ok 42613 1727204612.46944: done checking to see if all hosts have failed 42613 1727204612.46944: getting the remaining hosts for this loop 42613 1727204612.46945: done getting the remaining hosts for this loop 42613 1727204612.46947: getting the next task for host managed-node3 42613 1727204612.46950: done getting next task for host managed-node3 42613 1727204612.46952: ^ task is: TASK: meta (flush_handlers) 42613 1727204612.46953: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204612.46955: getting variables 42613 1727204612.46956: in VariableManager get_vars() 42613 1727204612.46963: Calling all_inventory to load vars for managed-node3 42613 1727204612.46967: Calling groups_inventory to load vars for managed-node3 42613 1727204612.46968: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204612.46974: Calling all_plugins_play to load vars for managed-node3 42613 1727204612.46976: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204612.46977: Calling groups_plugins_play to load vars for managed-node3 42613 1727204612.52004: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204612.53242: done with get_vars() 42613 1727204612.53270: done getting variables 42613 1727204612.53312: in VariableManager get_vars() 42613 1727204612.53320: Calling all_inventory to load vars for managed-node3 42613 1727204612.53322: Calling groups_inventory to load vars for managed-node3 42613 1727204612.53324: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204612.53328: Calling all_plugins_play to load vars for managed-node3 42613 1727204612.53330: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204612.53332: Calling groups_plugins_play to load vars for managed-node3 42613 1727204612.54218: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204612.55448: done with get_vars() 42613 1727204612.55481: done queuing things up, now waiting for results queue to drain 42613 1727204612.55483: results queue empty 42613 1727204612.55484: checking for any_errors_fatal 42613 1727204612.55485: done checking for any_errors_fatal 42613 1727204612.55485: checking for max_fail_percentage 42613 1727204612.55486: done checking for max_fail_percentage 42613 1727204612.55486: checking to see if all hosts have failed and the running result is not 
ok 42613 1727204612.55487: done checking to see if all hosts have failed 42613 1727204612.55488: getting the remaining hosts for this loop 42613 1727204612.55488: done getting the remaining hosts for this loop 42613 1727204612.55490: getting the next task for host managed-node3 42613 1727204612.55494: done getting next task for host managed-node3 42613 1727204612.55494: ^ task is: None 42613 1727204612.55496: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 42613 1727204612.55497: done queuing things up, now waiting for results queue to drain 42613 1727204612.55498: results queue empty 42613 1727204612.55498: checking for any_errors_fatal 42613 1727204612.55498: done checking for any_errors_fatal 42613 1727204612.55499: checking for max_fail_percentage 42613 1727204612.55499: done checking for max_fail_percentage 42613 1727204612.55500: checking to see if all hosts have failed and the running result is not ok 42613 1727204612.55500: done checking to see if all hosts have failed 42613 1727204612.55501: getting the next task for host managed-node3 42613 1727204612.55503: done getting next task for host managed-node3 42613 1727204612.55503: ^ task is: None 42613 1727204612.55504: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204612.55533: in VariableManager get_vars() 42613 1727204612.55551: done with get_vars() 42613 1727204612.55556: in VariableManager get_vars() 42613 1727204612.55564: done with get_vars() 42613 1727204612.55571: variable 'omit' from source: magic vars 42613 1727204612.55650: variable 'profile' from source: play vars 42613 1727204612.55719: in VariableManager get_vars() 42613 1727204612.55731: done with get_vars() 42613 1727204612.55748: variable 'omit' from source: magic vars 42613 1727204612.55795: variable 'profile' from source: play vars PLAY [Remove {{ profile }}] **************************************************** 42613 1727204612.56289: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 42613 1727204612.56312: getting the remaining hosts for this loop 42613 1727204612.56313: done getting the remaining hosts for this loop 42613 1727204612.56315: getting the next task for host managed-node3 42613 1727204612.56317: done getting next task for host managed-node3 42613 1727204612.56319: ^ task is: TASK: Gathering Facts 42613 1727204612.56320: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204612.56322: getting variables 42613 1727204612.56322: in VariableManager get_vars() 42613 1727204612.56331: Calling all_inventory to load vars for managed-node3 42613 1727204612.56333: Calling groups_inventory to load vars for managed-node3 42613 1727204612.56334: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204612.56341: Calling all_plugins_play to load vars for managed-node3 42613 1727204612.56343: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204612.56345: Calling groups_plugins_play to load vars for managed-node3 42613 1727204612.57357: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204612.58582: done with get_vars() 42613 1727204612.58607: done getting variables 42613 1727204612.58652: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3 Tuesday 24 September 2024 15:03:32 -0400 (0:00:00.615) 0:00:41.194 ***** 42613 1727204612.58675: entering _queue_task() for managed-node3/gather_facts 42613 1727204612.58969: worker is 1 (out of 1 available) 42613 1727204612.58982: exiting _queue_task() for managed-node3/gather_facts 42613 1727204612.58995: done queuing things up, now waiting for results queue to drain 42613 1727204612.58997: waiting for pending results... 
42613 1727204612.59190: running TaskExecutor() for managed-node3/TASK: Gathering Facts 42613 1727204612.59269: in run() - task 127b8e07-fff9-2f91-05d8-00000000059d 42613 1727204612.59282: variable 'ansible_search_path' from source: unknown 42613 1727204612.59315: calling self._execute() 42613 1727204612.59404: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204612.59412: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204612.59423: variable 'omit' from source: magic vars 42613 1727204612.59738: variable 'ansible_distribution_major_version' from source: facts 42613 1727204612.59748: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204612.59755: variable 'omit' from source: magic vars 42613 1727204612.59784: variable 'omit' from source: magic vars 42613 1727204612.59812: variable 'omit' from source: magic vars 42613 1727204612.59850: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 42613 1727204612.59886: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 42613 1727204612.59904: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 42613 1727204612.59920: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204612.59930: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204612.59959: variable 'inventory_hostname' from source: host vars for 'managed-node3' 42613 1727204612.59962: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204612.59967: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204612.60053: Set connection var ansible_shell_executable to /bin/sh 42613 1727204612.60057: Set 
connection var ansible_pipelining to False 42613 1727204612.60066: Set connection var ansible_module_compression to ZIP_DEFLATED 42613 1727204612.60070: Set connection var ansible_connection to ssh 42613 1727204612.60075: Set connection var ansible_timeout to 10 42613 1727204612.60078: Set connection var ansible_shell_type to sh 42613 1727204612.60101: variable 'ansible_shell_executable' from source: unknown 42613 1727204612.60107: variable 'ansible_connection' from source: unknown 42613 1727204612.60110: variable 'ansible_module_compression' from source: unknown 42613 1727204612.60113: variable 'ansible_shell_type' from source: unknown 42613 1727204612.60115: variable 'ansible_shell_executable' from source: unknown 42613 1727204612.60118: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204612.60120: variable 'ansible_pipelining' from source: unknown 42613 1727204612.60123: variable 'ansible_timeout' from source: unknown 42613 1727204612.60125: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204612.60273: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 42613 1727204612.60284: variable 'omit' from source: magic vars 42613 1727204612.60290: starting attempt loop 42613 1727204612.60294: running the handler 42613 1727204612.60306: variable 'ansible_facts' from source: unknown 42613 1727204612.60326: _low_level_execute_command(): starting 42613 1727204612.60332: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 42613 1727204612.60909: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 
1727204612.60915: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204612.60919: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204612.60983: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204612.60988: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204612.60991: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204612.61063: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204612.62904: stdout chunk (state=3): >>>/root <<< 42613 1727204612.63001: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204612.63073: stderr chunk (state=3): >>><<< 42613 1727204612.63077: stdout chunk (state=3): >>><<< 42613 1727204612.63101: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204612.63111: _low_level_execute_command(): starting 42613 1727204612.63117: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204612.6309814-45290-264145681925172 `" && echo ansible-tmp-1727204612.6309814-45290-264145681925172="` echo /root/.ansible/tmp/ansible-tmp-1727204612.6309814-45290-264145681925172 `" ) && sleep 0' 42613 1727204612.63622: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204612.63626: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204612.63630: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204612.63644: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204612.63684: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204612.63687: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204612.63691: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204612.63789: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204612.65940: stdout chunk (state=3): >>>ansible-tmp-1727204612.6309814-45290-264145681925172=/root/.ansible/tmp/ansible-tmp-1727204612.6309814-45290-264145681925172 <<< 42613 1727204612.66058: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204612.66121: stderr chunk (state=3): >>><<< 42613 1727204612.66126: stdout chunk (state=3): >>><<< 42613 1727204612.66144: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204612.6309814-45290-264145681925172=/root/.ansible/tmp/ansible-tmp-1727204612.6309814-45290-264145681925172 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204612.66180: variable 'ansible_module_compression' from source: unknown 42613 1727204612.66224: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-42613igbktgy6/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 42613 1727204612.66286: variable 'ansible_facts' from source: unknown 42613 1727204612.66420: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204612.6309814-45290-264145681925172/AnsiballZ_setup.py 42613 1727204612.66958: Sending initial data 42613 1727204612.66962: Sent initial data (154 bytes) 42613 1727204612.67240: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204612.67262: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204612.67380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204612.67403: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204612.67423: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204612.67529: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204612.69315: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 42613 1727204612.69344: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 42613 1727204612.69438: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 42613 1727204612.69542: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-42613igbktgy6/tmp7dyzabap /root/.ansible/tmp/ansible-tmp-1727204612.6309814-45290-264145681925172/AnsiballZ_setup.py <<< 42613 1727204612.69554: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204612.6309814-45290-264145681925172/AnsiballZ_setup.py" <<< 42613 1727204612.69603: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-42613igbktgy6/tmp7dyzabap" to remote "/root/.ansible/tmp/ansible-tmp-1727204612.6309814-45290-264145681925172/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204612.6309814-45290-264145681925172/AnsiballZ_setup.py" <<< 42613 1727204612.71125: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204612.71204: stderr chunk (state=3): >>><<< 42613 1727204612.71208: stdout chunk (state=3): >>><<< 42613 1727204612.71232: done transferring module to remote 42613 1727204612.71245: _low_level_execute_command(): starting 42613 1727204612.71250: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204612.6309814-45290-264145681925172/ /root/.ansible/tmp/ansible-tmp-1727204612.6309814-45290-264145681925172/AnsiballZ_setup.py && sleep 0' 42613 1727204612.71759: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204612.71763: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204612.71774: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204612.71777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204612.71826: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204612.71840: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204612.71904: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204612.73911: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204612.73977: stderr chunk (state=3): >>><<< 42613 1727204612.73981: stdout chunk (state=3): >>><<< 42613 1727204612.73995: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204612.73998: _low_level_execute_command(): starting 42613 1727204612.74003: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204612.6309814-45290-264145681925172/AnsiballZ_setup.py && sleep 0' 42613 1727204612.74507: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204612.74511: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204612.74514: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 42613 1727204612.74517: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204612.74577: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204612.74580: stderr chunk (state=3): >>>debug2: fd 3 setting 
O_NONBLOCK <<< 42613 1727204612.74585: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204612.74663: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204613.42500: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_local": {}, "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2f45753f4562ec5cb5af974e0ba4b7", "ansible_iscsi_iqn": "", "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_is_chroot": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", 
"XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 39754 10.31.45.169 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 39754 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_apparmor": {"status": "disabled"}, "ansible_fibre_channel_wwn": [], "ansible_fips": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", 
"tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": 
{"device": "eth0", "macaddress": "02:aa:78:a8:9b:13", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.169", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::aa:78ff:fea8:9b13", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": 
"off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload"<<< 42613 1727204613.42520: stdout chunk (state=3): >>>: "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.169", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:aa:78:a8:9b:13", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.169"], "ansible_all_ipv6_addresses": ["fe80::aa:78ff:fea8:9b13"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.169", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::aa:78ff:fea8:9b13"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3033, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 683, "free": 3033}, "nocache": {"free": 3478, "used": 238}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": 
"4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2f4575-3f45-62ec-5cb5-af974e0ba4b7", "ansible_product_uuid": "ec2f4575-3f45-62ec-5cb5-af974e0ba4b7", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 950, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": 
"/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251303591936, "block_size": 4096, "block_total": 64479564, "block_available": 61353416, "block_used": 3126148, "inode_total": 16384000, "inode_available": 16301441, "inode_used": 82559, "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391"}], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDMno02CCUpdfdqhSLw3DEMgRB2qjjltGvjvvCA4FaqYC6LCLimL/modyFwZBTr6mod5aVbEv6eVT3cFEFELkt0kQvtOziptiFW5YZ0dlqvF004nHv7tpUqUboaKXf3hY9kfDIHOuUKZOV1AH7UTuNGixxTuYXFQ+fG7hLGh4Vep864Qk6wN5hv56JDtXEzMMB7xxbnEU6nTFIA8TIX+aYYYxIipVJjI<<< 42613 1727204613.42528: stdout chunk (state=3): >>>+TR9J9VhQf+oNDJhhqqHyobnqG5WTt3jEYQo+8cWC4B8LegOCwae4jpCrLwhKnzmvV787NTqy90vgHgain4GhWTCKI+2dFsqryKBgKIBuENXOpmpRGT4gqBQHbc5v/vxWqYoPPhg1Wb8R+WRueYbdol4I10CveFNShlWCSRLSu/vOutS0xtU3WEIQFs2Mn06Aqq6bMoG70EJ9bJOEQ82f23VIkVKoW1hmcKHTCrMv715oNONo08LOTQkBYDv3MQpAtFQnpuIVPlAXYu1spAx3i2i31866ukCUE=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBG5WDcSyqLv17rg+6P3+2pjKf2x2X+Jf8yHGACagVeIm/l8LWG2NszXTHOdaZlbD4aes7hBRe0B7oCa8ilqHGf0=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAII5o3yNikV31ncy7je2EsNwog36vbYT7D9w98r4ZeD7x", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "03", "second": "33", "epoch": "1727204613", "epoch_int": "1727204613", "date": "2024-09-24", "time": "15:03:33", 
"iso8601_micro": "2024-09-24T19:03:33.421226Z", "iso8601": "2024-09-24T19:03:33Z", "iso8601_basic": "20240924T150333421226", "iso8601_basic_short": "20240924T150333", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_lsb": {}, "ansible_loadavg": {"1m": 0.56591796875, "5m": 0.6201171875, "15m": 0.42431640625}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 42613 1727204613.44795: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. <<< 42613 1727204613.44858: stderr chunk (state=3): >>><<< 42613 1727204613.44861: stdout chunk (state=3): >>><<< 42613 1727204613.44897: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_local": {}, "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2f45753f4562ec5cb5af974e0ba4b7", "ansible_iscsi_iqn": "", "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, 
"ansible_os_family": "RedHat", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_is_chroot": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 39754 10.31.45.169 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 39754 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], 
"ansible_apparmor": {"status": "disabled"}, "ansible_fibre_channel_wwn": [], "ansible_fips": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", 
"loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "02:aa:78:a8:9b:13", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.169", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::aa:78ff:fea8:9b13", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": 
"off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.169", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:aa:78:a8:9b:13", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.169"], "ansible_all_ipv6_addresses": ["fe80::aa:78ff:fea8:9b13"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.169", "127.0.0.0/8", 
"127.0.0.1"], "ipv6": ["::1", "fe80::aa:78ff:fea8:9b13"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3033, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 683, "free": 3033}, "nocache": {"free": 3478, "used": 238}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2f4575-3f45-62ec-5cb5-af974e0ba4b7", "ansible_product_uuid": "ec2f4575-3f45-62ec-5cb5-af974e0ba4b7", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, 
"holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 950, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251303591936, "block_size": 4096, "block_total": 64479564, "block_available": 61353416, "block_used": 3126148, "inode_total": 16384000, "inode_available": 16301441, "inode_used": 82559, "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391"}], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDMno02CCUpdfdqhSLw3DEMgRB2qjjltGvjvvCA4FaqYC6LCLimL/modyFwZBTr6mod5aVbEv6eVT3cFEFELkt0kQvtOziptiFW5YZ0dlqvF004nHv7tpUqUboaKXf3hY9kfDIHOuUKZOV1AH7UTuNGixxTuYXFQ+fG7hLGh4Vep864Qk6wN5hv56JDtXEzMMB7xxbnEU6nTFIA8TIX+aYYYxIipVJjI+TR9J9VhQf+oNDJhhqqHyobnqG5WTt3jEYQo+8cWC4B8LegOCwae4jpCrLwhKnzmvV787NTqy90vgHgain4GhWTCKI+2dFsqryKBgKIBuENXOpmpRGT4gqBQHbc5v/vxWqYoPPhg1Wb8R+WRueYbdol4I10CveFNShlWCSRLSu/vOutS0xtU3WEIQFs2Mn06Aqq6bMoG70EJ9bJOEQ82f23VIkVKoW1hmcKHTCrMv715oNONo08LOTQkBYDv3MQpAtFQnpuIVPlAXYu1spAx3i2i31866ukCUE=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBG5WDcSyqLv17rg+6P3+2pjKf2x2X+Jf8yHGACagVeIm/l8LWG2NszXTHOdaZlbD4aes7hBRe0B7oCa8ilqHGf0=", 
"ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAII5o3yNikV31ncy7je2EsNwog36vbYT7D9w98r4ZeD7x", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "03", "second": "33", "epoch": "1727204613", "epoch_int": "1727204613", "date": "2024-09-24", "time": "15:03:33", "iso8601_micro": "2024-09-24T19:03:33.421226Z", "iso8601": "2024-09-24T19:03:33Z", "iso8601_basic": "20240924T150333421226", "iso8601_basic_short": "20240924T150333", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_lsb": {}, "ansible_loadavg": {"1m": 0.56591796875, "5m": 0.6201171875, "15m": 0.42431640625}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 42613 1727204613.45176: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204612.6309814-45290-264145681925172/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 42613 1727204613.45196: _low_level_execute_command(): starting 42613 1727204613.45202: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204612.6309814-45290-264145681925172/ > /dev/null 2>&1 && sleep 0' 42613 1727204613.45712: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204613.45716: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204613.45719: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204613.45773: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204613.45778: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204613.45788: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204613.45856: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204613.47902: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204613.47965: stderr chunk (state=3): >>><<< 42613 1727204613.47969: stdout chunk (state=3): >>><<< 42613 1727204613.47987: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204613.47995: handler run complete 42613 1727204613.48098: variable 'ansible_facts' from source: unknown 42613 1727204613.48185: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204613.48403: variable 'ansible_facts' from source: unknown 42613 1727204613.48470: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204613.48564: attempt loop complete, returning result 42613 1727204613.48571: _execute() done 42613 1727204613.48574: dumping result to json 42613 1727204613.48596: done dumping result, returning 42613 1727204613.48606: done running TaskExecutor() for managed-node3/TASK: Gathering Facts [127b8e07-fff9-2f91-05d8-00000000059d] 42613 1727204613.48612: sending task result for task 127b8e07-fff9-2f91-05d8-00000000059d 42613 1727204613.48947: done sending task result for task 127b8e07-fff9-2f91-05d8-00000000059d 42613 1727204613.48950: WORKER PROCESS EXITING ok: [managed-node3] 42613 1727204613.49216: no more pending results, returning what we have 42613 1727204613.49219: results queue empty 42613 1727204613.49220: checking for any_errors_fatal 42613 1727204613.49222: done checking for 
any_errors_fatal 42613 1727204613.49222: checking for max_fail_percentage 42613 1727204613.49224: done checking for max_fail_percentage 42613 1727204613.49224: checking to see if all hosts have failed and the running result is not ok 42613 1727204613.49225: done checking to see if all hosts have failed 42613 1727204613.49225: getting the remaining hosts for this loop 42613 1727204613.49227: done getting the remaining hosts for this loop 42613 1727204613.49229: getting the next task for host managed-node3 42613 1727204613.49233: done getting next task for host managed-node3 42613 1727204613.49235: ^ task is: TASK: meta (flush_handlers) 42613 1727204613.49236: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 42613 1727204613.49241: getting variables 42613 1727204613.49242: in VariableManager get_vars() 42613 1727204613.49272: Calling all_inventory to load vars for managed-node3 42613 1727204613.49275: Calling groups_inventory to load vars for managed-node3 42613 1727204613.49286: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204613.49298: Calling all_plugins_play to load vars for managed-node3 42613 1727204613.49301: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204613.49304: Calling groups_plugins_play to load vars for managed-node3 42613 1727204613.50398: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204613.51650: done with get_vars() 42613 1727204613.51683: done getting variables 42613 1727204613.51743: in VariableManager get_vars() 42613 1727204613.51754: Calling all_inventory to load vars for managed-node3 42613 1727204613.51755: Calling groups_inventory to load vars for managed-node3 42613 
1727204613.51757: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204613.51761: Calling all_plugins_play to load vars for managed-node3 42613 1727204613.51762: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204613.51764: Calling groups_plugins_play to load vars for managed-node3 42613 1727204613.52660: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204613.54035: done with get_vars() 42613 1727204613.54067: done queuing things up, now waiting for results queue to drain 42613 1727204613.54069: results queue empty 42613 1727204613.54070: checking for any_errors_fatal 42613 1727204613.54073: done checking for any_errors_fatal 42613 1727204613.54074: checking for max_fail_percentage 42613 1727204613.54074: done checking for max_fail_percentage 42613 1727204613.54079: checking to see if all hosts have failed and the running result is not ok 42613 1727204613.54080: done checking to see if all hosts have failed 42613 1727204613.54080: getting the remaining hosts for this loop 42613 1727204613.54081: done getting the remaining hosts for this loop 42613 1727204613.54083: getting the next task for host managed-node3 42613 1727204613.54086: done getting next task for host managed-node3 42613 1727204613.54089: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 42613 1727204613.54090: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204613.54099: getting variables 42613 1727204613.54100: in VariableManager get_vars() 42613 1727204613.54110: Calling all_inventory to load vars for managed-node3 42613 1727204613.54112: Calling groups_inventory to load vars for managed-node3 42613 1727204613.54113: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204613.54118: Calling all_plugins_play to load vars for managed-node3 42613 1727204613.54119: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204613.54121: Calling groups_plugins_play to load vars for managed-node3 42613 1727204613.55223: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204613.57549: done with get_vars() 42613 1727204613.57589: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 15:03:33 -0400 (0:00:00.990) 0:00:42.185 ***** 42613 1727204613.57713: entering _queue_task() for managed-node3/include_tasks 42613 1727204613.58124: worker is 1 (out of 1 available) 42613 1727204613.58142: exiting _queue_task() for managed-node3/include_tasks 42613 1727204613.58156: done queuing things up, now waiting for results queue to drain 42613 1727204613.58158: waiting for pending results... 
42613 1727204613.58701: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 42613 1727204613.58707: in run() - task 127b8e07-fff9-2f91-05d8-000000000091 42613 1727204613.58711: variable 'ansible_search_path' from source: unknown 42613 1727204613.58714: variable 'ansible_search_path' from source: unknown 42613 1727204613.58717: calling self._execute() 42613 1727204613.58834: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204613.58851: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204613.58868: variable 'omit' from source: magic vars 42613 1727204613.59316: variable 'ansible_distribution_major_version' from source: facts 42613 1727204613.59347: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204613.59364: _execute() done 42613 1727204613.59372: dumping result to json 42613 1727204613.59375: done dumping result, returning 42613 1727204613.59381: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [127b8e07-fff9-2f91-05d8-000000000091] 42613 1727204613.59387: sending task result for task 127b8e07-fff9-2f91-05d8-000000000091 42613 1727204613.59500: done sending task result for task 127b8e07-fff9-2f91-05d8-000000000091 42613 1727204613.59503: WORKER PROCESS EXITING 42613 1727204613.59548: no more pending results, returning what we have 42613 1727204613.59554: in VariableManager get_vars() 42613 1727204613.59604: Calling all_inventory to load vars for managed-node3 42613 1727204613.59607: Calling groups_inventory to load vars for managed-node3 42613 1727204613.59609: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204613.59625: Calling all_plugins_play to load vars for managed-node3 42613 1727204613.59627: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204613.59630: Calling 
groups_plugins_play to load vars for managed-node3 42613 1727204613.60849: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204613.62475: done with get_vars() 42613 1727204613.62508: variable 'ansible_search_path' from source: unknown 42613 1727204613.62509: variable 'ansible_search_path' from source: unknown 42613 1727204613.62547: we have included files to process 42613 1727204613.62551: generating all_blocks data 42613 1727204613.62553: done generating all_blocks data 42613 1727204613.62554: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 42613 1727204613.62555: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 42613 1727204613.62557: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 42613 1727204613.63001: done processing included file 42613 1727204613.63002: iterating over new_blocks loaded from include file 42613 1727204613.63003: in VariableManager get_vars() 42613 1727204613.63020: done with get_vars() 42613 1727204613.63021: filtering new block on tags 42613 1727204613.63033: done filtering new block on tags 42613 1727204613.63035: in VariableManager get_vars() 42613 1727204613.63052: done with get_vars() 42613 1727204613.63054: filtering new block on tags 42613 1727204613.63070: done filtering new block on tags 42613 1727204613.63072: in VariableManager get_vars() 42613 1727204613.63085: done with get_vars() 42613 1727204613.63086: filtering new block on tags 42613 1727204613.63096: done filtering new block on tags 42613 1727204613.63097: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node3 42613 1727204613.63102: extending task lists for 
all hosts with included blocks 42613 1727204613.63348: done extending task lists 42613 1727204613.63350: done processing included files 42613 1727204613.63350: results queue empty 42613 1727204613.63351: checking for any_errors_fatal 42613 1727204613.63352: done checking for any_errors_fatal 42613 1727204613.63352: checking for max_fail_percentage 42613 1727204613.63353: done checking for max_fail_percentage 42613 1727204613.63354: checking to see if all hosts have failed and the running result is not ok 42613 1727204613.63354: done checking to see if all hosts have failed 42613 1727204613.63355: getting the remaining hosts for this loop 42613 1727204613.63356: done getting the remaining hosts for this loop 42613 1727204613.63357: getting the next task for host managed-node3 42613 1727204613.63360: done getting next task for host managed-node3 42613 1727204613.63362: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 42613 1727204613.63364: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204613.63373: getting variables 42613 1727204613.63374: in VariableManager get_vars() 42613 1727204613.63386: Calling all_inventory to load vars for managed-node3 42613 1727204613.63389: Calling groups_inventory to load vars for managed-node3 42613 1727204613.63391: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204613.63395: Calling all_plugins_play to load vars for managed-node3 42613 1727204613.63397: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204613.63399: Calling groups_plugins_play to load vars for managed-node3 42613 1727204613.64414: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204613.65637: done with get_vars() 42613 1727204613.65673: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 15:03:33 -0400 (0:00:00.080) 0:00:42.265 ***** 42613 1727204613.65736: entering _queue_task() for managed-node3/setup 42613 1727204613.66043: worker is 1 (out of 1 available) 42613 1727204613.66058: exiting _queue_task() for managed-node3/setup 42613 1727204613.66072: done queuing things up, now waiting for results queue to drain 42613 1727204613.66074: waiting for pending results... 
42613 1727204613.66269: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 42613 1727204613.66370: in run() - task 127b8e07-fff9-2f91-05d8-0000000005de 42613 1727204613.66382: variable 'ansible_search_path' from source: unknown 42613 1727204613.66386: variable 'ansible_search_path' from source: unknown 42613 1727204613.66423: calling self._execute() 42613 1727204613.66502: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204613.66507: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204613.66518: variable 'omit' from source: magic vars 42613 1727204613.66826: variable 'ansible_distribution_major_version' from source: facts 42613 1727204613.66840: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204613.67021: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 42613 1727204613.68973: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 42613 1727204613.68977: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 42613 1727204613.68980: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 42613 1727204613.68982: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 42613 1727204613.69000: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 42613 1727204613.69092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204613.69127: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
42613 1727204613.69156: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
42613 1727204613.69202: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
42613 1727204613.69220: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
42613 1727204613.69285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
42613 1727204613.69310: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
42613 1727204613.69338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
42613 1727204613.69383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
42613 1727204613.69398: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
42613 1727204613.69582: variable '__network_required_facts' from source: role '' defaults
42613 1727204613.69596: variable 'ansible_facts' from source: unknown
42613 1727204613.70486: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False
42613 1727204613.70495: when evaluation is False, skipping this task
42613 1727204613.70501: _execute() done
42613 1727204613.70507: dumping result to json
42613 1727204613.70513: done dumping result, returning
42613 1727204613.70526: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [127b8e07-fff9-2f91-05d8-0000000005de]
42613 1727204613.70537: sending task result for task 127b8e07-fff9-2f91-05d8-0000000005de
skipping: [managed-node3] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
42613 1727204613.70738: no more pending results, returning what we have
42613 1727204613.70744: results queue empty
42613 1727204613.70745: checking for any_errors_fatal
42613 1727204613.70747: done checking for any_errors_fatal
42613 1727204613.70747: checking for max_fail_percentage
42613 1727204613.70750: done checking for max_fail_percentage
42613 1727204613.70751: checking to see if all hosts have failed and the running result is not ok
42613 1727204613.70752: done checking to see if all hosts have failed
42613 1727204613.70752: getting the remaining hosts for this loop
42613 1727204613.70754: done getting the remaining hosts for this loop
42613 1727204613.70758: getting the next task for host managed-node3
42613 1727204613.70771: done getting next task for host managed-node3
42613 1727204613.70775: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree
42613 1727204613.70778: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
42613 1727204613.70794: getting variables
42613 1727204613.70797: in VariableManager get_vars()
42613 1727204613.70836: Calling all_inventory to load vars for managed-node3
42613 1727204613.70841: Calling groups_inventory to load vars for managed-node3
42613 1727204613.70844: Calling all_plugins_inventory to load vars for managed-node3
42613 1727204613.70855: Calling all_plugins_play to load vars for managed-node3
42613 1727204613.70858: Calling groups_plugins_inventory to load vars for managed-node3
42613 1727204613.70861: Calling groups_plugins_play to load vars for managed-node3
42613 1727204613.71688: done sending task result for task 127b8e07-fff9-2f91-05d8-0000000005de
42613 1727204613.71693: WORKER PROCESS EXITING
42613 1727204613.72951: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
42613 1727204613.75196: done with get_vars()
42613 1727204613.75240: done getting variables

TASK [fedora.linux_system_roles.network : Check if system is ostree] ***********
task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12
Tuesday 24 September 2024 15:03:33 -0400 (0:00:00.096) 0:00:42.361 *****
42613 1727204613.75358: entering _queue_task() for managed-node3/stat
42613 1727204613.75733: worker is 1 (out of 1 available)
42613 1727204613.75749: exiting _queue_task() for managed-node3/stat
42613 1727204613.75764: done queuing things up, now waiting for results queue to drain
42613 1727204613.75767: waiting for pending results...
42613 1727204613.76042: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if system is ostree
42613 1727204613.76202: in run() - task 127b8e07-fff9-2f91-05d8-0000000005e0
42613 1727204613.76233: variable 'ansible_search_path' from source: unknown
42613 1727204613.76237: variable 'ansible_search_path' from source: unknown
42613 1727204613.76262: calling self._execute()
42613 1727204613.76342: variable 'ansible_host' from source: host vars for 'managed-node3'
42613 1727204613.76353: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
42613 1727204613.76362: variable 'omit' from source: magic vars
42613 1727204613.76687: variable 'ansible_distribution_major_version' from source: facts
42613 1727204613.76697: Evaluated conditional (ansible_distribution_major_version != '6'): True
42613 1727204613.76833: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
42613 1727204613.77061: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
42613 1727204613.77102: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
42613 1727204613.77134: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
42613 1727204613.77168: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
42613 1727204613.77241: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
42613 1727204613.77265: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
42613 1727204613.77285: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
42613 1727204613.77304: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
42613 1727204613.77381: variable '__network_is_ostree' from source: set_fact
42613 1727204613.77388: Evaluated conditional (not __network_is_ostree is defined): False
42613 1727204613.77392: when evaluation is False, skipping this task
42613 1727204613.77394: _execute() done
42613 1727204613.77397: dumping result to json
42613 1727204613.77401: done dumping result, returning
42613 1727204613.77410: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [127b8e07-fff9-2f91-05d8-0000000005e0]
42613 1727204613.77415: sending task result for task 127b8e07-fff9-2f91-05d8-0000000005e0
42613 1727204613.77516: done sending task result for task 127b8e07-fff9-2f91-05d8-0000000005e0
42613 1727204613.77519: WORKER PROCESS EXITING
skipping: [managed-node3] => {
    "changed": false,
    "false_condition": "not __network_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}
42613 1727204613.77577: no more pending results, returning what we have
42613 1727204613.77580: results queue empty
42613 1727204613.77582: checking for any_errors_fatal
42613 1727204613.77589: done checking for any_errors_fatal
42613 1727204613.77590: checking for max_fail_percentage
42613 1727204613.77592: done checking for max_fail_percentage
42613 1727204613.77593: checking to see if all hosts have failed and the running result is not ok
42613 1727204613.77594: done checking to see if all hosts have failed
42613 1727204613.77595: getting the remaining hosts for this loop
42613 1727204613.77596: done getting the remaining hosts for this loop
42613 1727204613.77600: getting the next task for host managed-node3
42613 1727204613.77608: done getting next task for host managed-node3
42613 1727204613.77611: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree
42613 1727204613.77614: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
42613 1727204613.77633: getting variables
42613 1727204613.77634: in VariableManager get_vars()
42613 1727204613.77682: Calling all_inventory to load vars for managed-node3
42613 1727204613.77685: Calling groups_inventory to load vars for managed-node3
42613 1727204613.77687: Calling all_plugins_inventory to load vars for managed-node3
42613 1727204613.77697: Calling all_plugins_play to load vars for managed-node3
42613 1727204613.77700: Calling groups_plugins_inventory to load vars for managed-node3
42613 1727204613.77703: Calling groups_plugins_play to load vars for managed-node3
42613 1727204613.78914: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
42613 1727204613.80152: done with get_vars()
42613 1727204613.80187: done getting variables
42613 1727204613.80237: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] ***
task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17
Tuesday 24 September 2024 15:03:33 -0400 (0:00:00.049) 0:00:42.410 *****
42613 1727204613.80270: entering _queue_task() for managed-node3/set_fact
42613 1727204613.80560: worker is 1 (out of 1 available)
42613 1727204613.80577: exiting _queue_task() for managed-node3/set_fact
42613 1727204613.80590: done queuing things up, now waiting for results queue to drain
42613 1727204613.80592: waiting for pending results...
42613 1727204613.80797: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree
42613 1727204613.80912: in run() - task 127b8e07-fff9-2f91-05d8-0000000005e1
42613 1727204613.80924: variable 'ansible_search_path' from source: unknown
42613 1727204613.80928: variable 'ansible_search_path' from source: unknown
42613 1727204613.80964: calling self._execute()
42613 1727204613.81058: variable 'ansible_host' from source: host vars for 'managed-node3'
42613 1727204613.81064: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
42613 1727204613.81075: variable 'omit' from source: magic vars
42613 1727204613.81386: variable 'ansible_distribution_major_version' from source: facts
42613 1727204613.81397: Evaluated conditional (ansible_distribution_major_version != '6'): True
42613 1727204613.81532: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
42613 1727204613.81751: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
42613 1727204613.81788: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
42613 1727204613.81820: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
42613 1727204613.81851: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
42613 1727204613.81925: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
42613 1727204613.81947: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
42613 1727204613.81968: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
42613 1727204613.81988: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
42613 1727204613.82071: variable '__network_is_ostree' from source: set_fact
42613 1727204613.82075: Evaluated conditional (not __network_is_ostree is defined): False
42613 1727204613.82078: when evaluation is False, skipping this task
42613 1727204613.82082: _execute() done
42613 1727204613.82085: dumping result to json
42613 1727204613.82087: done dumping result, returning
42613 1727204613.82096: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [127b8e07-fff9-2f91-05d8-0000000005e1]
42613 1727204613.82101: sending task result for task 127b8e07-fff9-2f91-05d8-0000000005e1
42613 1727204613.82199: done sending task result for task 127b8e07-fff9-2f91-05d8-0000000005e1
42613 1727204613.82202: WORKER PROCESS EXITING
skipping: [managed-node3] => {
    "changed": false,
    "false_condition": "not __network_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}
42613 1727204613.82252: no more pending results, returning what we have
42613 1727204613.82256: results queue empty
42613 1727204613.82257: checking for any_errors_fatal
42613 1727204613.82267: done checking for any_errors_fatal
42613 1727204613.82268: checking for max_fail_percentage
42613 1727204613.82270: done checking for max_fail_percentage
42613 1727204613.82271: checking to see if all hosts have failed and the running result is not ok
42613 1727204613.82273: done checking to see if all hosts have failed
42613 1727204613.82273: getting the remaining hosts for this loop
42613 1727204613.82275: done getting the remaining hosts for this loop
42613 1727204613.82280: getting the next task for host managed-node3
42613 1727204613.82290: done getting next task for host managed-node3
42613 1727204613.82294: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running
42613 1727204613.82297: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
42613 1727204613.82311: getting variables
42613 1727204613.82313: in VariableManager get_vars()
42613 1727204613.82354: Calling all_inventory to load vars for managed-node3
42613 1727204613.82357: Calling groups_inventory to load vars for managed-node3
42613 1727204613.82359: Calling all_plugins_inventory to load vars for managed-node3
42613 1727204613.82377: Calling all_plugins_play to load vars for managed-node3
42613 1727204613.82380: Calling groups_plugins_inventory to load vars for managed-node3
42613 1727204613.82384: Calling groups_plugins_play to load vars for managed-node3
42613 1727204613.83577: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
42613 1727204613.84800: done with get_vars()
42613 1727204613.84830: done getting variables

TASK [fedora.linux_system_roles.network : Check which services are running] ****
task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Tuesday 24 September 2024 15:03:33 -0400 (0:00:00.046) 0:00:42.457 *****
42613 1727204613.84915: entering _queue_task() for managed-node3/service_facts
42613 1727204613.85208: worker is 1 (out of 1 available)
42613 1727204613.85224: exiting _queue_task() for managed-node3/service_facts
42613 1727204613.85236: done queuing things up, now waiting for results queue to drain
42613 1727204613.85238: waiting for pending results...
42613 1727204613.85441: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which services are running
42613 1727204613.85540: in run() - task 127b8e07-fff9-2f91-05d8-0000000005e3
42613 1727204613.85556: variable 'ansible_search_path' from source: unknown
42613 1727204613.85560: variable 'ansible_search_path' from source: unknown
42613 1727204613.85596: calling self._execute()
42613 1727204613.85679: variable 'ansible_host' from source: host vars for 'managed-node3'
42613 1727204613.85683: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
42613 1727204613.85697: variable 'omit' from source: magic vars
42613 1727204613.86006: variable 'ansible_distribution_major_version' from source: facts
42613 1727204613.86023: Evaluated conditional (ansible_distribution_major_version != '6'): True
42613 1727204613.86026: variable 'omit' from source: magic vars
42613 1727204613.86075: variable 'omit' from source: magic vars
42613 1727204613.86102: variable 'omit' from source: magic vars
42613 1727204613.86144: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
42613 1727204613.86178: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
42613 1727204613.86197: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
42613 1727204613.86211: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
42613 1727204613.86221: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
42613 1727204613.86253: variable 'inventory_hostname' from source: host vars for 'managed-node3'
42613 1727204613.86256: variable 'ansible_host' from source: host vars for 'managed-node3'
42613 1727204613.86260: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
42613 1727204613.86339: Set connection var ansible_shell_executable to /bin/sh
42613 1727204613.86352: Set connection var ansible_pipelining to False
42613 1727204613.86355: Set connection var ansible_module_compression to ZIP_DEFLATED
42613 1727204613.86358: Set connection var ansible_connection to ssh
42613 1727204613.86363: Set connection var ansible_timeout to 10
42613 1727204613.86367: Set connection var ansible_shell_type to sh
42613 1727204613.86387: variable 'ansible_shell_executable' from source: unknown
42613 1727204613.86390: variable 'ansible_connection' from source: unknown
42613 1727204613.86393: variable 'ansible_module_compression' from source: unknown
42613 1727204613.86396: variable 'ansible_shell_type' from source: unknown
42613 1727204613.86398: variable 'ansible_shell_executable' from source: unknown
42613 1727204613.86400: variable 'ansible_host' from source: host vars for 'managed-node3'
42613 1727204613.86405: variable 'ansible_pipelining' from source: unknown
42613 1727204613.86408: variable 'ansible_timeout' from source: unknown
42613 1727204613.86412: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
42613 1727204613.86579: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action)
42613 1727204613.86589: variable 'omit' from source: magic vars
42613 1727204613.86594: starting attempt loop
42613 1727204613.86597: running the handler
42613 1727204613.86609: _low_level_execute_command(): starting
42613 1727204613.86616: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
42613 1727204613.87179: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
42613 1727204613.87185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
42613 1727204613.87189: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
42613 1727204613.87236: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<<
42613 1727204613.87257: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
42613 1727204613.87330: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
42613 1727204613.89183: stdout chunk (state=3): >>>/root <<<
42613 1727204613.89293: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
42613 1727204613.89352: stderr chunk (state=3): >>><<<
42613 1727204613.89356: stdout chunk (state=3): >>><<<
42613 1727204613.89379: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
42613 1727204613.89394: _low_level_execute_command(): starting
42613 1727204613.89403: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204613.893809-45343-1692720177956 `" && echo ansible-tmp-1727204613.893809-45343-1692720177956="` echo /root/.ansible/tmp/ansible-tmp-1727204613.893809-45343-1692720177956 `" ) && sleep 0'
42613 1727204613.90204: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
42613 1727204613.90208: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
42613 1727204613.90211: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
42613 1727204613.90276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
42613 1727204613.90302: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<<
42613 1727204613.90350: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
42613 1727204613.90456: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
42613 1727204613.92635: stdout chunk (state=3): >>>ansible-tmp-1727204613.893809-45343-1692720177956=/root/.ansible/tmp/ansible-tmp-1727204613.893809-45343-1692720177956 <<<
42613 1727204613.92737: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
42613 1727204613.92807: stderr chunk (state=3): >>><<<
42613 1727204613.92811: stdout chunk (state=3): >>><<<
42613 1727204613.92827: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204613.893809-45343-1692720177956=/root/.ansible/tmp/ansible-tmp-1727204613.893809-45343-1692720177956 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
42613 1727204613.92873: variable 'ansible_module_compression' from source: unknown
42613 1727204613.92916: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-42613igbktgy6/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED
42613 1727204613.92955: variable 'ansible_facts' from source: unknown
42613 1727204613.93022: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204613.893809-45343-1692720177956/AnsiballZ_service_facts.py
42613 1727204613.93143: Sending initial data
42613 1727204613.93146: Sent initial data (159 bytes)
42613 1727204613.93674: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
42613 1727204613.93678: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
42613 1727204613.93681: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<<
42613 1727204613.93683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
42613 1727204613.93737: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<<
42613 1727204613.93741: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
42613 1727204613.93745: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
42613 1727204613.93826: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
42613 1727204613.95659: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<<
42613 1727204613.95789: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<<
42613 1727204613.95839: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-42613igbktgy6/tmpklyv2ih6 /root/.ansible/tmp/ansible-tmp-1727204613.893809-45343-1692720177956/AnsiballZ_service_facts.py <<<
42613 1727204613.95844: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204613.893809-45343-1692720177956/AnsiballZ_service_facts.py" <<<
42613 1727204613.95940: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-42613igbktgy6/tmpklyv2ih6" to remote "/root/.ansible/tmp/ansible-tmp-1727204613.893809-45343-1692720177956/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204613.893809-45343-1692720177956/AnsiballZ_service_facts.py" <<<
42613 1727204613.96909: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
42613 1727204613.96970: stderr chunk (state=3): >>><<<
42613 1727204613.96974: stdout chunk (state=3): >>><<<
42613 1727204613.97075: done transferring module to remote
42613 1727204613.97079: _low_level_execute_command(): starting
42613 1727204613.97081: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204613.893809-45343-1692720177956/ /root/.ansible/tmp/ansible-tmp-1727204613.893809-45343-1692720177956/AnsiballZ_service_facts.py && sleep 0'
42613 1727204613.97797: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<<
42613 1727204613.97873: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
42613 1727204613.97936: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<<
42613 1727204613.97974: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
42613 1727204613.98025: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
42613 1727204613.98117: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
42613 1727204614.00205: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
42613 1727204614.00262: stderr chunk (state=3): >>><<<
42613 1727204614.00279: stdout chunk (state=3): >>><<<
42613 1727204614.00393: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
42613 1727204614.00398: _low_level_execute_command(): starting
42613 1727204614.00400: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204613.893809-45343-1692720177956/AnsiballZ_service_facts.py && sleep 0'
42613 1727204614.01176: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<<
42613 1727204614.01196: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
42613 1727204614.01316: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
42613 1727204616.43485: stdout chunk (state=3): >>> {"ansible_facts":
{"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": 
"dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": 
"lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": 
"nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": 
"rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": 
"systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": 
{"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, 
"dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, 
"grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "nmstate.service": {"name": "nmstate.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": 
"static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": 
"unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": 
{"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": 
"systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 42613 1727204616.45475: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. <<< 42613 1727204616.45479: stdout chunk (state=3): >>><<< 42613 1727204616.45481: stderr chunk (state=3): >>><<< 42613 1727204616.45486: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": 
"cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": 
"running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": 
"stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", 
"status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": 
{"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", 
"source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": 
{"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", 
"source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": 
"dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "nmstate.service": {"name": "nmstate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": 
"polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", 
"status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
42613 1727204616.48017: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204613.893809-45343-1692720177956/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 42613 1727204616.48058: _low_level_execute_command(): starting 42613 1727204616.48076: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204613.893809-45343-1692720177956/ > /dev/null 2>&1 && sleep 0' 42613 1727204616.49410: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204616.49415: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 42613 1727204616.49418: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 42613 1727204616.49420: stderr 
chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204616.49590: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204616.49606: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204616.49623: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204616.51816: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204616.51821: stdout chunk (state=3): >>><<< 42613 1727204616.51824: stderr chunk (state=3): >>><<< 42613 1727204616.51847: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204616.51883: handler 
run complete 42613 1727204616.52922: variable 'ansible_facts' from source: unknown 42613 1727204616.53197: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204616.54774: variable 'ansible_facts' from source: unknown 42613 1727204616.55358: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204616.55896: attempt loop complete, returning result 42613 1727204616.55913: _execute() done 42613 1727204616.55927: dumping result to json 42613 1727204616.56011: done dumping result, returning 42613 1727204616.56033: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which services are running [127b8e07-fff9-2f91-05d8-0000000005e3] 42613 1727204616.56045: sending task result for task 127b8e07-fff9-2f91-05d8-0000000005e3 ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 42613 1727204616.57948: no more pending results, returning what we have 42613 1727204616.57952: results queue empty 42613 1727204616.57953: checking for any_errors_fatal 42613 1727204616.57959: done checking for any_errors_fatal 42613 1727204616.57960: checking for max_fail_percentage 42613 1727204616.57961: done checking for max_fail_percentage 42613 1727204616.57962: checking to see if all hosts have failed and the running result is not ok 42613 1727204616.57963: done checking to see if all hosts have failed 42613 1727204616.57964: getting the remaining hosts for this loop 42613 1727204616.57966: done getting the remaining hosts for this loop 42613 1727204616.57970: getting the next task for host managed-node3 42613 1727204616.57976: done getting next task for host managed-node3 42613 1727204616.57980: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 42613 1727204616.57983: ^ state is: HOST STATE: 
block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 42613 1727204616.57992: getting variables 42613 1727204616.57994: in VariableManager get_vars() 42613 1727204616.58026: Calling all_inventory to load vars for managed-node3 42613 1727204616.58029: Calling groups_inventory to load vars for managed-node3 42613 1727204616.58031: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204616.58041: Calling all_plugins_play to load vars for managed-node3 42613 1727204616.58044: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204616.58047: Calling groups_plugins_play to load vars for managed-node3 42613 1727204616.58776: done sending task result for task 127b8e07-fff9-2f91-05d8-0000000005e3 42613 1727204616.58781: WORKER PROCESS EXITING 42613 1727204616.61390: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204616.63683: done with get_vars() 42613 1727204616.63722: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 15:03:36 -0400 (0:00:02.789) 0:00:45.246 ***** 42613 1727204616.63824: entering _queue_task() for managed-node3/package_facts 42613 1727204616.64132: worker is 1 (out of 1 available) 42613 1727204616.64147: exiting _queue_task() for managed-node3/package_facts 42613 
1727204616.64160: done queuing things up, now waiting for results queue to drain 42613 1727204616.64161: waiting for pending results... 42613 1727204616.64374: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 42613 1727204616.64484: in run() - task 127b8e07-fff9-2f91-05d8-0000000005e4 42613 1727204616.64497: variable 'ansible_search_path' from source: unknown 42613 1727204616.64500: variable 'ansible_search_path' from source: unknown 42613 1727204616.64533: calling self._execute() 42613 1727204616.64632: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204616.64870: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204616.64875: variable 'omit' from source: magic vars 42613 1727204616.65062: variable 'ansible_distribution_major_version' from source: facts 42613 1727204616.65081: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204616.65091: variable 'omit' from source: magic vars 42613 1727204616.65158: variable 'omit' from source: magic vars 42613 1727204616.65213: variable 'omit' from source: magic vars 42613 1727204616.65255: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 42613 1727204616.65298: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 42613 1727204616.65326: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 42613 1727204616.65570: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204616.65573: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204616.65576: variable 'inventory_hostname' from source: host vars for 'managed-node3' 42613 1727204616.65578: 
variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204616.65580: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204616.65583: Set connection var ansible_shell_executable to /bin/sh 42613 1727204616.65585: Set connection var ansible_pipelining to False 42613 1727204616.65587: Set connection var ansible_module_compression to ZIP_DEFLATED 42613 1727204616.65589: Set connection var ansible_connection to ssh 42613 1727204616.65591: Set connection var ansible_timeout to 10 42613 1727204616.65593: Set connection var ansible_shell_type to sh 42613 1727204616.65595: variable 'ansible_shell_executable' from source: unknown 42613 1727204616.65629: variable 'ansible_connection' from source: unknown 42613 1727204616.65641: variable 'ansible_module_compression' from source: unknown 42613 1727204616.65651: variable 'ansible_shell_type' from source: unknown 42613 1727204616.65658: variable 'ansible_shell_executable' from source: unknown 42613 1727204616.65668: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204616.65686: variable 'ansible_pipelining' from source: unknown 42613 1727204616.65692: variable 'ansible_timeout' from source: unknown 42613 1727204616.65699: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204616.65952: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 42613 1727204616.65972: variable 'omit' from source: magic vars 42613 1727204616.65983: starting attempt loop 42613 1727204616.65991: running the handler 42613 1727204616.66011: _low_level_execute_command(): starting 42613 1727204616.66023: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 42613 1727204616.66815: stderr chunk (state=2): 
>>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204616.66890: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found <<< 42613 1727204616.66918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204616.67023: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204616.67030: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204616.67111: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204616.68949: stdout chunk (state=3): >>>/root <<< 42613 1727204616.69288: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204616.69291: stdout chunk (state=3): >>><<< 42613 1727204616.69294: stderr chunk (state=3): >>><<< 42613 1727204616.69325: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204616.69443: _low_level_execute_command(): starting 42613 1727204616.69447: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204616.6933725-45647-19211717906997 `" && echo ansible-tmp-1727204616.6933725-45647-19211717906997="` echo /root/.ansible/tmp/ansible-tmp-1727204616.6933725-45647-19211717906997 `" ) && sleep 0' 42613 1727204616.70153: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204616.70157: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204616.70160: stderr chunk (state=3): >>>debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204616.70173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204616.70228: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204616.70237: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204616.70332: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204616.72499: stdout chunk (state=3): >>>ansible-tmp-1727204616.6933725-45647-19211717906997=/root/.ansible/tmp/ansible-tmp-1727204616.6933725-45647-19211717906997 <<< 42613 1727204616.72652: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204616.72674: stderr chunk (state=3): >>><<< 42613 1727204616.72678: stdout chunk (state=3): >>><<< 42613 1727204616.72698: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204616.6933725-45647-19211717906997=/root/.ansible/tmp/ansible-tmp-1727204616.6933725-45647-19211717906997 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204616.72746: variable 'ansible_module_compression' from source: unknown 42613 1727204616.72790: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-42613igbktgy6/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 42613 1727204616.72853: variable 'ansible_facts' from source: unknown 42613 1727204616.72977: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204616.6933725-45647-19211717906997/AnsiballZ_package_facts.py 42613 1727204616.73310: Sending initial data 42613 1727204616.73314: Sent initial data (161 bytes) 42613 1727204616.73874: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204616.73977: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found <<< 42613 1727204616.73986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204616.74076: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204616.74091: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204616.74108: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204616.74191: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204616.75984: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 42613 1727204616.76048: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 42613 1727204616.76114: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-42613igbktgy6/tmpojy5xg7f /root/.ansible/tmp/ansible-tmp-1727204616.6933725-45647-19211717906997/AnsiballZ_package_facts.py <<< 42613 1727204616.76120: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204616.6933725-45647-19211717906997/AnsiballZ_package_facts.py" <<< 42613 1727204616.76176: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-42613igbktgy6/tmpojy5xg7f" to remote "/root/.ansible/tmp/ansible-tmp-1727204616.6933725-45647-19211717906997/AnsiballZ_package_facts.py" <<< 42613 1727204616.76182: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204616.6933725-45647-19211717906997/AnsiballZ_package_facts.py" <<< 42613 1727204616.77420: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204616.77504: stderr chunk (state=3): >>><<< 42613 1727204616.77508: stdout chunk (state=3): >>><<< 42613 1727204616.77527: done transferring module to remote 42613 1727204616.77540: _low_level_execute_command(): starting 42613 1727204616.77549: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204616.6933725-45647-19211717906997/ /root/.ansible/tmp/ansible-tmp-1727204616.6933725-45647-19211717906997/AnsiballZ_package_facts.py && sleep 0' 42613 1727204616.78048: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204616.78053: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204616.78055: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204616.78058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204616.78114: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204616.78127: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204616.78188: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204616.89201: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204616.89206: stdout chunk (state=3): >>><<< 42613 1727204616.89208: stderr chunk (state=3): >>><<< 42613 1727204616.89228: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204616.89326: _low_level_execute_command(): starting 42613 1727204616.89330: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204616.6933725-45647-19211717906997/AnsiballZ_package_facts.py && sleep 0' 42613 1727204616.90107: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204616.90127: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204616.90144: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204616.90164: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 42613 1727204616.90188: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 42613 1727204616.90205: stderr chunk (state=3): >>>debug2: match not found <<< 42613 1727204616.90219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204616.90317: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204616.90330: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204616.90350: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204616.90375: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204616.90560: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204617.54769: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, <<< 42613 1727204617.54827: stdout chunk (state=3): >>>"arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "29.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": 
"libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": 
"duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.<<< 42613 1727204617.54879: stdout chunk (state=3): >>>fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": 
"lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", 
"release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "li<<< 42613 1727204617.54929: stdout chunk (state=3): >>>breport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": 
"4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", 
"release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": 
[{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": 
[{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", 
"epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", 
"release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", 
"version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": 
"2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null,
"arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", 
"version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch",
"source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": 
"perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release":
"504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": 
[{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", 
"version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": 
"gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": "wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoc<<< 42613 1727204617.55001: stdout chunk (state=3): >>>h": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": 
"3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "1.fc40", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "nmstate": [{"name": "nmstate", "version": "2.2.34", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-config-server": [{"name": "NetworkManager-config-server", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "nmstate-libs": [{"name": "nmstate-libs", "version": "2.2.34", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libnmstate": [{"name": "python3-libnmstate", "version": "2.2.34", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 42613 1727204617.57043: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
<<< 42613 1727204617.57111: stderr chunk (state=3): >>><<< 42613 1727204617.57115: stdout chunk (state=3): >>><<< 42613 1727204617.57176: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", 
"version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "29.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": 
[{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", 
"version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": 
"0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": 
"1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": 
"1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": 
"3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": 
"4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": 
"4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": 
"x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": 
"perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": 
"1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], 
"perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": 
"6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": "wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", 
"version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": 
"python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", 
"version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", 
"version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "nmstate": [{"name": "nmstate", "version": "2.2.34", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-config-server": [{"name": "NetworkManager-config-server", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "nmstate-libs": [{"name": "nmstate-libs", "version": "2.2.34", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libnmstate": [{"name": "python3-libnmstate", "version": "2.2.34", "release": "1.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
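The JSON payload above is the return value of the `package_facts` module; the trailing `invocation` block records the arguments it ran with (`{"manager": ["auto"], "strategy": "first"}`). As a hedged sketch (not the role's verbatim source), the task producing this output plausibly looks like the following; `no_log: true` is inferred from the "censored" result shown later in this log:

```yaml
# Sketch of the task whose module return appears above.
# manager/strategy are taken from the "invocation" block in the log;
# no_log is an inference from the censored task result.
- name: Check which packages are installed
  ansible.builtin.package_facts:
    manager: auto
    strategy: first
  no_log: true
```

With `manager: auto` and `strategy: first`, the module probes supported package managers in order and returns facts from the first one that works (here, `rpm`), populating `ansible_facts.packages`.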
42613 1727204617.58927: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204616.6933725-45647-19211717906997/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 42613 1727204617.58950: _low_level_execute_command(): starting 42613 1727204617.58954: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204616.6933725-45647-19211717906997/ > /dev/null 2>&1 && sleep 0' 42613 1727204617.59481: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204617.59486: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204617.59490: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204617.59492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 
originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204617.59549: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204617.59552: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204617.59555: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204617.59637: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204617.61701: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204617.61763: stderr chunk (state=3): >>><<< 42613 1727204617.61769: stdout chunk (state=3): >>><<< 42613 1727204617.61784: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received 
exit status from master 0 42613 1727204617.61790: handler run complete 42613 1727204617.62423: variable 'ansible_facts' from source: unknown 42613 1727204617.62793: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204617.64360: variable 'ansible_facts' from source: unknown 42613 1727204617.64703: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204617.65274: attempt loop complete, returning result 42613 1727204617.65288: _execute() done 42613 1727204617.65291: dumping result to json 42613 1727204617.65452: done dumping result, returning 42613 1727204617.65461: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [127b8e07-fff9-2f91-05d8-0000000005e4] 42613 1727204617.65468: sending task result for task 127b8e07-fff9-2f91-05d8-0000000005e4 42613 1727204617.67339: done sending task result for task 127b8e07-fff9-2f91-05d8-0000000005e4 42613 1727204617.67344: WORKER PROCESS EXITING ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 42613 1727204617.67439: no more pending results, returning what we have 42613 1727204617.67443: results queue empty 42613 1727204617.67444: checking for any_errors_fatal 42613 1727204617.67449: done checking for any_errors_fatal 42613 1727204617.67450: checking for max_fail_percentage 42613 1727204617.67451: done checking for max_fail_percentage 42613 1727204617.67452: checking to see if all hosts have failed and the running result is not ok 42613 1727204617.67452: done checking to see if all hosts have failed 42613 1727204617.67453: getting the remaining hosts for this loop 42613 1727204617.67454: done getting the remaining hosts for this loop 42613 1727204617.67457: getting the next task for host managed-node3 42613 1727204617.67462: done 
getting next task for host managed-node3 42613 1727204617.67468: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 42613 1727204617.67470: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 42613 1727204617.67476: getting variables 42613 1727204617.67477: in VariableManager get_vars() 42613 1727204617.67501: Calling all_inventory to load vars for managed-node3 42613 1727204617.67503: Calling groups_inventory to load vars for managed-node3 42613 1727204617.67505: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204617.67512: Calling all_plugins_play to load vars for managed-node3 42613 1727204617.67514: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204617.67516: Calling groups_plugins_play to load vars for managed-node3 42613 1727204617.68488: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204617.69735: done with get_vars() 42613 1727204617.69769: done getting variables 42613 1727204617.69821: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 15:03:37 -0400 (0:00:01.060) 0:00:46.306 ***** 42613 1727204617.69849: entering _queue_task() for managed-node3/debug 42613 1727204617.70142: worker is 1 (out of 1 available) 42613 
1727204617.70157: exiting _queue_task() for managed-node3/debug 42613 1727204617.70172: done queuing things up, now waiting for results queue to drain 42613 1727204617.70173: waiting for pending results... 42613 1727204617.70373: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider 42613 1727204617.70459: in run() - task 127b8e07-fff9-2f91-05d8-000000000092 42613 1727204617.70474: variable 'ansible_search_path' from source: unknown 42613 1727204617.70478: variable 'ansible_search_path' from source: unknown 42613 1727204617.70512: calling self._execute() 42613 1727204617.70595: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204617.70600: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204617.70608: variable 'omit' from source: magic vars 42613 1727204617.70929: variable 'ansible_distribution_major_version' from source: facts 42613 1727204617.70942: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204617.70954: variable 'omit' from source: magic vars 42613 1727204617.70987: variable 'omit' from source: magic vars 42613 1727204617.71067: variable 'network_provider' from source: set_fact 42613 1727204617.71085: variable 'omit' from source: magic vars 42613 1727204617.71124: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 42613 1727204617.71156: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 42613 1727204617.71179: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 42613 1727204617.71194: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204617.71204: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 42613 1727204617.71231: variable 'inventory_hostname' from source: host vars for 'managed-node3' 42613 1727204617.71234: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204617.71238: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204617.71325: Set connection var ansible_shell_executable to /bin/sh 42613 1727204617.71328: Set connection var ansible_pipelining to False 42613 1727204617.71336: Set connection var ansible_module_compression to ZIP_DEFLATED 42613 1727204617.71339: Set connection var ansible_connection to ssh 42613 1727204617.71347: Set connection var ansible_timeout to 10 42613 1727204617.71349: Set connection var ansible_shell_type to sh 42613 1727204617.71369: variable 'ansible_shell_executable' from source: unknown 42613 1727204617.71373: variable 'ansible_connection' from source: unknown 42613 1727204617.71377: variable 'ansible_module_compression' from source: unknown 42613 1727204617.71380: variable 'ansible_shell_type' from source: unknown 42613 1727204617.71382: variable 'ansible_shell_executable' from source: unknown 42613 1727204617.71385: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204617.71389: variable 'ansible_pipelining' from source: unknown 42613 1727204617.71391: variable 'ansible_timeout' from source: unknown 42613 1727204617.71393: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204617.71515: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 42613 1727204617.71519: variable 'omit' from source: magic vars 42613 1727204617.71522: starting attempt loop 42613 1727204617.71525: running the handler 42613 1727204617.71570: handler run 
complete 42613 1727204617.71583: attempt loop complete, returning result 42613 1727204617.71587: _execute() done 42613 1727204617.71589: dumping result to json 42613 1727204617.71592: done dumping result, returning 42613 1727204617.71600: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider [127b8e07-fff9-2f91-05d8-000000000092] 42613 1727204617.71606: sending task result for task 127b8e07-fff9-2f91-05d8-000000000092 42613 1727204617.71700: done sending task result for task 127b8e07-fff9-2f91-05d8-000000000092 42613 1727204617.71703: WORKER PROCESS EXITING ok: [managed-node3] => {} MSG: Using network provider: nm 42613 1727204617.71790: no more pending results, returning what we have 42613 1727204617.71794: results queue empty 42613 1727204617.71795: checking for any_errors_fatal 42613 1727204617.71804: done checking for any_errors_fatal 42613 1727204617.71805: checking for max_fail_percentage 42613 1727204617.71807: done checking for max_fail_percentage 42613 1727204617.71808: checking to see if all hosts have failed and the running result is not ok 42613 1727204617.71809: done checking to see if all hosts have failed 42613 1727204617.71810: getting the remaining hosts for this loop 42613 1727204617.71811: done getting the remaining hosts for this loop 42613 1727204617.71815: getting the next task for host managed-node3 42613 1727204617.71822: done getting next task for host managed-node3 42613 1727204617.71826: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 42613 1727204617.71828: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204617.71840: getting variables 42613 1727204617.71841: in VariableManager get_vars() 42613 1727204617.71881: Calling all_inventory to load vars for managed-node3 42613 1727204617.71883: Calling groups_inventory to load vars for managed-node3 42613 1727204617.71885: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204617.71896: Calling all_plugins_play to load vars for managed-node3 42613 1727204617.71898: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204617.71901: Calling groups_plugins_play to load vars for managed-node3 42613 1727204617.72960: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204617.78572: done with get_vars() 42613 1727204617.78602: done getting variables 42613 1727204617.78647: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 15:03:37 -0400 (0:00:00.088) 0:00:46.394 ***** 42613 1727204617.78670: entering _queue_task() for managed-node3/fail 42613 1727204617.78969: worker is 1 (out of 1 available) 42613 1727204617.78985: exiting _queue_task() for managed-node3/fail 42613 1727204617.78997: done queuing things up, now waiting for results queue to drain 42613 1727204617.78998: waiting for pending results... 
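The task queued above ("Abort applying the network state configuration if using the `network_state` variable with the initscripts provider", tasks/main.yml:11) is skipped a few records later because its guard, `network_state != {}`, evaluates to False against the role default. A minimal sketch of such a guarded abort task, reconstructed only from the task name and the "Evaluated conditional" record in this log — the `fail` message wording is an assumption, not the role's actual source:

```yaml
# Hypothetical reconstruction; only the task name and the `when` expression
# come from the log. The msg text is assumed.
- name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
  ansible.builtin.fail:
    msg: Applying network_state is not supported by the initscripts provider  # assumed wording
  when: network_state != {}  # role default is {}, so this evaluates False and the task is skipped
```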
42613 1727204617.79202: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 42613 1727204617.79303: in run() - task 127b8e07-fff9-2f91-05d8-000000000093 42613 1727204617.79316: variable 'ansible_search_path' from source: unknown 42613 1727204617.79320: variable 'ansible_search_path' from source: unknown 42613 1727204617.79356: calling self._execute() 42613 1727204617.79446: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204617.79452: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204617.79463: variable 'omit' from source: magic vars 42613 1727204617.79790: variable 'ansible_distribution_major_version' from source: facts 42613 1727204617.79801: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204617.79897: variable 'network_state' from source: role '' defaults 42613 1727204617.79910: Evaluated conditional (network_state != {}): False 42613 1727204617.79914: when evaluation is False, skipping this task 42613 1727204617.79917: _execute() done 42613 1727204617.79920: dumping result to json 42613 1727204617.79922: done dumping result, returning 42613 1727204617.79929: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [127b8e07-fff9-2f91-05d8-000000000093] 42613 1727204617.79934: sending task result for task 127b8e07-fff9-2f91-05d8-000000000093 42613 1727204617.80044: done sending task result for task 127b8e07-fff9-2f91-05d8-000000000093 42613 1727204617.80047: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 42613 1727204617.80100: no more pending results, 
returning what we have 42613 1727204617.80104: results queue empty 42613 1727204617.80105: checking for any_errors_fatal 42613 1727204617.80116: done checking for any_errors_fatal 42613 1727204617.80117: checking for max_fail_percentage 42613 1727204617.80119: done checking for max_fail_percentage 42613 1727204617.80120: checking to see if all hosts have failed and the running result is not ok 42613 1727204617.80121: done checking to see if all hosts have failed 42613 1727204617.80121: getting the remaining hosts for this loop 42613 1727204617.80123: done getting the remaining hosts for this loop 42613 1727204617.80127: getting the next task for host managed-node3 42613 1727204617.80135: done getting next task for host managed-node3 42613 1727204617.80139: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 42613 1727204617.80142: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204617.80158: getting variables 42613 1727204617.80160: in VariableManager get_vars() 42613 1727204617.80200: Calling all_inventory to load vars for managed-node3 42613 1727204617.80203: Calling groups_inventory to load vars for managed-node3 42613 1727204617.80205: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204617.80216: Calling all_plugins_play to load vars for managed-node3 42613 1727204617.80219: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204617.80221: Calling groups_plugins_play to load vars for managed-node3 42613 1727204617.81273: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204617.82520: done with get_vars() 42613 1727204617.82552: done getting variables 42613 1727204617.82604: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 15:03:37 -0400 (0:00:00.039) 0:00:46.434 ***** 42613 1727204617.82629: entering _queue_task() for managed-node3/fail 42613 1727204617.82941: worker is 1 (out of 1 available) 42613 1727204617.82957: exiting _queue_task() for managed-node3/fail 42613 1727204617.82973: done queuing things up, now waiting for results queue to drain 42613 1727204617.82975: waiting for pending results... 
42613 1727204617.83192: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 42613 1727204617.83281: in run() - task 127b8e07-fff9-2f91-05d8-000000000094 42613 1727204617.83294: variable 'ansible_search_path' from source: unknown 42613 1727204617.83298: variable 'ansible_search_path' from source: unknown 42613 1727204617.83333: calling self._execute() 42613 1727204617.83421: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204617.83428: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204617.83438: variable 'omit' from source: magic vars 42613 1727204617.83767: variable 'ansible_distribution_major_version' from source: facts 42613 1727204617.83772: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204617.83862: variable 'network_state' from source: role '' defaults 42613 1727204617.83877: Evaluated conditional (network_state != {}): False 42613 1727204617.83881: when evaluation is False, skipping this task 42613 1727204617.83884: _execute() done 42613 1727204617.83887: dumping result to json 42613 1727204617.83890: done dumping result, returning 42613 1727204617.83897: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [127b8e07-fff9-2f91-05d8-000000000094] 42613 1727204617.83900: sending task result for task 127b8e07-fff9-2f91-05d8-000000000094 42613 1727204617.84002: done sending task result for task 127b8e07-fff9-2f91-05d8-000000000094 42613 1727204617.84005: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 42613 1727204617.84055: no more pending results, returning what we have 42613 
1727204617.84058: results queue empty 42613 1727204617.84059: checking for any_errors_fatal 42613 1727204617.84069: done checking for any_errors_fatal 42613 1727204617.84070: checking for max_fail_percentage 42613 1727204617.84072: done checking for max_fail_percentage 42613 1727204617.84073: checking to see if all hosts have failed and the running result is not ok 42613 1727204617.84074: done checking to see if all hosts have failed 42613 1727204617.84075: getting the remaining hosts for this loop 42613 1727204617.84077: done getting the remaining hosts for this loop 42613 1727204617.84081: getting the next task for host managed-node3 42613 1727204617.84089: done getting next task for host managed-node3 42613 1727204617.84093: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 42613 1727204617.84095: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204617.84112: getting variables 42613 1727204617.84113: in VariableManager get_vars() 42613 1727204617.84154: Calling all_inventory to load vars for managed-node3 42613 1727204617.84158: Calling groups_inventory to load vars for managed-node3 42613 1727204617.84159: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204617.84177: Calling all_plugins_play to load vars for managed-node3 42613 1727204617.84180: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204617.84183: Calling groups_plugins_play to load vars for managed-node3 42613 1727204617.85925: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204617.87189: done with get_vars() 42613 1727204617.87223: done getting variables 42613 1727204617.87277: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 15:03:37 -0400 (0:00:00.046) 0:00:46.481 ***** 42613 1727204617.87301: entering _queue_task() for managed-node3/fail 42613 1727204617.87596: worker is 1 (out of 1 available) 42613 1727204617.87611: exiting _queue_task() for managed-node3/fail 42613 1727204617.87626: done queuing things up, now waiting for results queue to drain 42613 1727204617.87628: waiting for pending results... 
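The EL10 teaming abort task queued above (tasks/main.yml:25) illustrates a compound guard: the log records two separate "Evaluated conditional" lines, `ansible_distribution_major_version | int > 9` → True and `ansible_distribution in __network_rh_distros` → False, and the task is skipped because a `when` list is ANDed — every item must be true. A sketch of such a task, with both expressions taken verbatim from the log and everything else (module arguments, message) assumed:

```yaml
# Hypothetical sketch of the EL10 teaming guard. The two `when` expressions
# are verbatim from the "Evaluated conditional" records; the msg is assumed.
- name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
  ansible.builtin.fail:
    msg: Team interfaces are not supported on EL10 or later  # assumed wording
  when:  # list items are ANDed; the second evaluated False, so the task is skipped
    - ansible_distribution_major_version | int > 9
    - ansible_distribution in __network_rh_distros
```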
42613 1727204617.88023: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 42613 1727204617.88273: in run() - task 127b8e07-fff9-2f91-05d8-000000000095 42613 1727204617.88278: variable 'ansible_search_path' from source: unknown 42613 1727204617.88281: variable 'ansible_search_path' from source: unknown 42613 1727204617.88284: calling self._execute() 42613 1727204617.88502: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204617.88514: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204617.88527: variable 'omit' from source: magic vars 42613 1727204617.88958: variable 'ansible_distribution_major_version' from source: facts 42613 1727204617.88981: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204617.89180: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 42613 1727204617.91754: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 42613 1727204617.91851: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 42613 1727204617.91901: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 42613 1727204617.91946: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 42613 1727204617.91980: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 42613 1727204617.92086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204617.92123: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204617.92157: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204617.92207: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204617.92227: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204617.92343: variable 'ansible_distribution_major_version' from source: facts 42613 1727204617.92470: Evaluated conditional (ansible_distribution_major_version | int > 9): True 42613 1727204617.92508: variable 'ansible_distribution' from source: facts 42613 1727204617.92517: variable '__network_rh_distros' from source: role '' defaults 42613 1727204617.92532: Evaluated conditional (ansible_distribution in __network_rh_distros): False 42613 1727204617.92542: when evaluation is False, skipping this task 42613 1727204617.92550: _execute() done 42613 1727204617.92556: dumping result to json 42613 1727204617.92564: done dumping result, returning 42613 1727204617.92581: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [127b8e07-fff9-2f91-05d8-000000000095] 42613 1727204617.92591: sending task result for task 127b8e07-fff9-2f91-05d8-000000000095 skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": 
"Conditional result was False" } 42613 1727204617.92754: no more pending results, returning what we have 42613 1727204617.92757: results queue empty 42613 1727204617.92759: checking for any_errors_fatal 42613 1727204617.92767: done checking for any_errors_fatal 42613 1727204617.92768: checking for max_fail_percentage 42613 1727204617.92771: done checking for max_fail_percentage 42613 1727204617.92772: checking to see if all hosts have failed and the running result is not ok 42613 1727204617.92773: done checking to see if all hosts have failed 42613 1727204617.92773: getting the remaining hosts for this loop 42613 1727204617.92774: done getting the remaining hosts for this loop 42613 1727204617.92780: getting the next task for host managed-node3 42613 1727204617.92791: done getting next task for host managed-node3 42613 1727204617.92795: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 42613 1727204617.92797: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204617.92816: getting variables 42613 1727204617.92818: in VariableManager get_vars() 42613 1727204617.92860: Calling all_inventory to load vars for managed-node3 42613 1727204617.92862: Calling groups_inventory to load vars for managed-node3 42613 1727204617.92864: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204617.93019: Calling all_plugins_play to load vars for managed-node3 42613 1727204617.93022: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204617.93026: Calling groups_plugins_play to load vars for managed-node3 42613 1727204617.93550: done sending task result for task 127b8e07-fff9-2f91-05d8-000000000095 42613 1727204617.93554: WORKER PROCESS EXITING 42613 1727204617.96427: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204618.00392: done with get_vars() 42613 1727204618.00428: done getting variables 42613 1727204618.00500: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 15:03:38 -0400 (0:00:00.132) 0:00:46.613 ***** 42613 1727204618.00534: entering _queue_task() for managed-node3/dnf 42613 1727204618.00930: worker is 1 (out of 1 available) 42613 1727204618.00947: exiting _queue_task() for managed-node3/dnf 42613 1727204618.00962: done queuing things up, now waiting for results queue to drain 42613 1727204618.00963: waiting for pending results... 
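The DNF task queued above (tasks/main.yml:36) runs only when wireless or team connections are defined; the log later shows `__network_wireless_connections_defined or __network_team_connections_defined` evaluating to False after inspecting `network_connections`, so the check is skipped. A rough sketch of a conditional package-update check in this shape — the conditional comes from the log, but the package name and `dnf` arguments are assumptions, not the role's actual task:

```yaml
# Hypothetical sketch; only the task name and the `when` expression are from
# the log. Package name and module arguments are assumed for illustration.
- name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
  ansible.builtin.dnf:
    name: NetworkManager  # assumed package
    state: latest
  check_mode: true  # report whether an update would happen without applying it
  when: __network_wireless_connections_defined or __network_team_connections_defined
```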
42613 1727204618.01323: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 42613 1727204618.01527: in run() - task 127b8e07-fff9-2f91-05d8-000000000096 42613 1727204618.01531: variable 'ansible_search_path' from source: unknown 42613 1727204618.01535: variable 'ansible_search_path' from source: unknown 42613 1727204618.01538: calling self._execute() 42613 1727204618.01614: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204618.01633: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204618.01648: variable 'omit' from source: magic vars 42613 1727204618.02082: variable 'ansible_distribution_major_version' from source: facts 42613 1727204618.02105: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204618.02328: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 42613 1727204618.05777: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 42613 1727204618.05881: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 42613 1727204618.05926: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 42613 1727204618.05978: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 42613 1727204618.06010: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 42613 1727204618.06113: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204618.06172: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204618.06191: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204618.06242: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204618.06283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204618.06422: variable 'ansible_distribution' from source: facts 42613 1727204618.06500: variable 'ansible_distribution_major_version' from source: facts 42613 1727204618.06504: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 42613 1727204618.06594: variable '__network_wireless_connections_defined' from source: role '' defaults 42613 1727204618.06770: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204618.06803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204618.06843: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204618.06893: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204618.06913: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204618.06973: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204618.07002: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204618.07046: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204618.07088: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204618.07155: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204618.07162: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204618.07193: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 
1727204618.07223: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204618.07277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204618.07297: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204618.07494: variable 'network_connections' from source: play vars 42613 1727204618.07512: variable 'profile' from source: play vars 42613 1727204618.07610: variable 'profile' from source: play vars 42613 1727204618.07620: variable 'interface' from source: set_fact 42613 1727204618.07770: variable 'interface' from source: set_fact 42613 1727204618.07794: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 42613 1727204618.08013: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 42613 1727204618.08071: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 42613 1727204618.08110: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 42613 1727204618.08152: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 42613 1727204618.08207: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 42613 1727204618.08234: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 42613 1727204618.08360: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204618.08363: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 42613 1727204618.08375: variable '__network_team_connections_defined' from source: role '' defaults 42613 1727204618.08669: variable 'network_connections' from source: play vars 42613 1727204618.08681: variable 'profile' from source: play vars 42613 1727204618.08763: variable 'profile' from source: play vars 42613 1727204618.08777: variable 'interface' from source: set_fact 42613 1727204618.08852: variable 'interface' from source: set_fact 42613 1727204618.08886: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 42613 1727204618.08894: when evaluation is False, skipping this task 42613 1727204618.08913: _execute() done 42613 1727204618.08916: dumping result to json 42613 1727204618.08919: done dumping result, returning 42613 1727204618.08970: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [127b8e07-fff9-2f91-05d8-000000000096] 42613 1727204618.08973: sending task result for task 127b8e07-fff9-2f91-05d8-000000000096 42613 1727204618.09311: done sending task result for task 127b8e07-fff9-2f91-05d8-000000000096 42613 1727204618.09314: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": 
"Conditional result was False" } 42613 1727204618.09370: no more pending results, returning what we have 42613 1727204618.09374: results queue empty 42613 1727204618.09375: checking for any_errors_fatal 42613 1727204618.09380: done checking for any_errors_fatal 42613 1727204618.09381: checking for max_fail_percentage 42613 1727204618.09383: done checking for max_fail_percentage 42613 1727204618.09384: checking to see if all hosts have failed and the running result is not ok 42613 1727204618.09385: done checking to see if all hosts have failed 42613 1727204618.09386: getting the remaining hosts for this loop 42613 1727204618.09387: done getting the remaining hosts for this loop 42613 1727204618.09391: getting the next task for host managed-node3 42613 1727204618.09398: done getting next task for host managed-node3 42613 1727204618.09402: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 42613 1727204618.09407: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204618.09420: getting variables 42613 1727204618.09422: in VariableManager get_vars() 42613 1727204618.09460: Calling all_inventory to load vars for managed-node3 42613 1727204618.09463: Calling groups_inventory to load vars for managed-node3 42613 1727204618.09465: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204618.09476: Calling all_plugins_play to load vars for managed-node3 42613 1727204618.09479: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204618.09481: Calling groups_plugins_play to load vars for managed-node3 42613 1727204618.11712: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204618.13981: done with get_vars() 42613 1727204618.14026: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 42613 1727204618.14118: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 15:03:38 -0400 (0:00:00.136) 0:00:46.749 ***** 42613 1727204618.14154: entering _queue_task() for managed-node3/yum 42613 1727204618.14560: worker is 1 (out of 1 available) 42613 1727204618.14680: exiting _queue_task() for managed-node3/yum 42613 1727204618.14690: done queuing things up, now waiting for results queue to drain 42613 1727204618.14691: waiting for pending results... 
42613 1727204618.14925: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 42613 1727204618.15060: in run() - task 127b8e07-fff9-2f91-05d8-000000000097 42613 1727204618.15084: variable 'ansible_search_path' from source: unknown 42613 1727204618.15092: variable 'ansible_search_path' from source: unknown 42613 1727204618.15135: calling self._execute() 42613 1727204618.15262: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204618.15276: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204618.15290: variable 'omit' from source: magic vars 42613 1727204618.15729: variable 'ansible_distribution_major_version' from source: facts 42613 1727204618.15752: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204618.15963: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 42613 1727204618.18548: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 42613 1727204618.19083: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 42613 1727204618.19135: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 42613 1727204618.19179: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 42613 1727204618.19214: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 42613 1727204618.19315: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204618.19357: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204618.19393: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204618.19454: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204618.19478: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204618.19599: variable 'ansible_distribution_major_version' from source: facts 42613 1727204618.19635: Evaluated conditional (ansible_distribution_major_version | int < 8): False 42613 1727204618.19641: when evaluation is False, skipping this task 42613 1727204618.19644: _execute() done 42613 1727204618.19646: dumping result to json 42613 1727204618.19671: done dumping result, returning 42613 1727204618.19675: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [127b8e07-fff9-2f91-05d8-000000000097] 42613 1727204618.19678: sending task result for task 127b8e07-fff9-2f91-05d8-000000000097 skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 42613 1727204618.19984: no more pending results, returning what we have 42613 1727204618.19988: results queue empty 42613 1727204618.19989: checking for any_errors_fatal 42613 1727204618.19995: done 
checking for any_errors_fatal 42613 1727204618.19996: checking for max_fail_percentage 42613 1727204618.20000: done checking for max_fail_percentage 42613 1727204618.20001: checking to see if all hosts have failed and the running result is not ok 42613 1727204618.20002: done checking to see if all hosts have failed 42613 1727204618.20003: getting the remaining hosts for this loop 42613 1727204618.20004: done getting the remaining hosts for this loop 42613 1727204618.20009: getting the next task for host managed-node3 42613 1727204618.20016: done getting next task for host managed-node3 42613 1727204618.20021: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 42613 1727204618.20023: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204618.20038: getting variables 42613 1727204618.20043: in VariableManager get_vars() 42613 1727204618.20085: Calling all_inventory to load vars for managed-node3 42613 1727204618.20088: Calling groups_inventory to load vars for managed-node3 42613 1727204618.20091: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204618.20102: Calling all_plugins_play to load vars for managed-node3 42613 1727204618.20105: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204618.20108: Calling groups_plugins_play to load vars for managed-node3 42613 1727204618.20751: done sending task result for task 127b8e07-fff9-2f91-05d8-000000000097 42613 1727204618.20756: WORKER PROCESS EXITING 42613 1727204618.22533: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204618.24846: done with get_vars() 42613 1727204618.24894: done getting variables 42613 1727204618.24967: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 15:03:38 -0400 (0:00:00.108) 0:00:46.858 ***** 42613 1727204618.25001: entering _queue_task() for managed-node3/fail 42613 1727204618.25590: worker is 1 (out of 1 available) 42613 1727204618.25602: exiting _queue_task() for managed-node3/fail 42613 1727204618.25613: done queuing things up, now waiting for results queue to drain 42613 1727204618.25614: waiting for pending results... 
42613 1727204618.25769: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 42613 1727204618.25904: in run() - task 127b8e07-fff9-2f91-05d8-000000000098 42613 1727204618.25924: variable 'ansible_search_path' from source: unknown 42613 1727204618.25931: variable 'ansible_search_path' from source: unknown 42613 1727204618.25982: calling self._execute() 42613 1727204618.26097: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204618.26110: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204618.26126: variable 'omit' from source: magic vars 42613 1727204618.26773: variable 'ansible_distribution_major_version' from source: facts 42613 1727204618.26796: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204618.27072: variable '__network_wireless_connections_defined' from source: role '' defaults 42613 1727204618.27625: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 42613 1727204618.31352: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 42613 1727204618.31416: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 42613 1727204618.31450: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 42613 1727204618.31480: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 42613 1727204618.31502: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 42613 1727204618.31575: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 42613 1727204618.31597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204618.31619: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204618.31652: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204618.31664: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204618.31705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204618.31779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204618.31782: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204618.31785: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204618.31787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204618.31815: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204618.31836: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204618.31855: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204618.31886: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204618.31896: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204618.32031: variable 'network_connections' from source: play vars 42613 1727204618.32045: variable 'profile' from source: play vars 42613 1727204618.32105: variable 'profile' from source: play vars 42613 1727204618.32109: variable 'interface' from source: set_fact 42613 1727204618.32157: variable 'interface' from source: set_fact 42613 1727204618.32216: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 42613 1727204618.32349: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 42613 1727204618.32384: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 42613 1727204618.32408: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 42613 1727204618.32433: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 42613 1727204618.32470: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 42613 1727204618.32489: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 42613 1727204618.32508: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204618.32526: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 42613 1727204618.32570: variable '__network_team_connections_defined' from source: role '' defaults 42613 1727204618.32764: variable 'network_connections' from source: play vars 42613 1727204618.32770: variable 'profile' from source: play vars 42613 1727204618.32865: variable 'profile' from source: play vars 42613 1727204618.32874: variable 'interface' from source: set_fact 42613 1727204618.33288: variable 'interface' from source: set_fact 42613 1727204618.33292: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 42613 1727204618.33295: when evaluation is False, skipping this task 42613 1727204618.33298: _execute() done 42613 1727204618.33300: dumping result to json 42613 1727204618.33302: done dumping result, returning 42613 1727204618.33305: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's 
consent to restart NetworkManager due to wireless or team interfaces [127b8e07-fff9-2f91-05d8-000000000098] 42613 1727204618.33317: sending task result for task 127b8e07-fff9-2f91-05d8-000000000098 42613 1727204618.33411: done sending task result for task 127b8e07-fff9-2f91-05d8-000000000098 42613 1727204618.33415: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 42613 1727204618.33478: no more pending results, returning what we have 42613 1727204618.33482: results queue empty 42613 1727204618.33483: checking for any_errors_fatal 42613 1727204618.33489: done checking for any_errors_fatal 42613 1727204618.33489: checking for max_fail_percentage 42613 1727204618.33492: done checking for max_fail_percentage 42613 1727204618.33493: checking to see if all hosts have failed and the running result is not ok 42613 1727204618.33494: done checking to see if all hosts have failed 42613 1727204618.33495: getting the remaining hosts for this loop 42613 1727204618.33497: done getting the remaining hosts for this loop 42613 1727204618.33501: getting the next task for host managed-node3 42613 1727204618.33508: done getting next task for host managed-node3 42613 1727204618.33512: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 42613 1727204618.33514: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204618.33530: getting variables 42613 1727204618.33532: in VariableManager get_vars() 42613 1727204618.33581: Calling all_inventory to load vars for managed-node3 42613 1727204618.33584: Calling groups_inventory to load vars for managed-node3 42613 1727204618.33586: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204618.33597: Calling all_plugins_play to load vars for managed-node3 42613 1727204618.33600: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204618.33603: Calling groups_plugins_play to load vars for managed-node3 42613 1727204618.35435: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204618.36969: done with get_vars() 42613 1727204618.37008: done getting variables 42613 1727204618.37082: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 15:03:38 -0400 (0:00:00.121) 0:00:46.979 ***** 42613 1727204618.37114: entering _queue_task() for managed-node3/package 42613 1727204618.37507: worker is 1 (out of 1 available) 42613 1727204618.37523: exiting _queue_task() for managed-node3/package 42613 1727204618.37538: done queuing things up, now waiting for results queue to drain 42613 1727204618.37543: waiting for pending results... 
42613 1727204618.38289: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages 42613 1727204618.38480: in run() - task 127b8e07-fff9-2f91-05d8-000000000099 42613 1727204618.38487: variable 'ansible_search_path' from source: unknown 42613 1727204618.38490: variable 'ansible_search_path' from source: unknown 42613 1727204618.38618: calling self._execute() 42613 1727204618.38828: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204618.38832: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204618.38836: variable 'omit' from source: magic vars 42613 1727204618.39283: variable 'ansible_distribution_major_version' from source: facts 42613 1727204618.39287: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204618.39456: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 42613 1727204618.39672: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 42613 1727204618.39712: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 42613 1727204618.39739: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 42613 1727204618.39810: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 42613 1727204618.39909: variable 'network_packages' from source: role '' defaults 42613 1727204618.39996: variable '__network_provider_setup' from source: role '' defaults 42613 1727204618.40013: variable '__network_service_name_default_nm' from source: role '' defaults 42613 1727204618.40070: variable '__network_service_name_default_nm' from source: role '' defaults 42613 1727204618.40078: variable '__network_packages_default_nm' from source: role '' defaults 42613 1727204618.40128: variable 
'__network_packages_default_nm' from source: role '' defaults 42613 1727204618.40263: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 42613 1727204618.43004: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 42613 1727204618.43037: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 42613 1727204618.43089: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 42613 1727204618.43219: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 42613 1727204618.43225: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 42613 1727204618.43357: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204618.43427: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204618.43482: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204618.43531: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204618.43653: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 
1727204618.43657: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204618.43687: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204618.43718: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204618.43827: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204618.43848: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204618.44153: variable '__network_packages_default_gobject_packages' from source: role '' defaults 42613 1727204618.44247: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204618.44267: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204618.44285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204618.44322: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204618.44334: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204618.44408: variable 'ansible_python' from source: facts 42613 1727204618.44436: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 42613 1727204618.44504: variable '__network_wpa_supplicant_required' from source: role '' defaults 42613 1727204618.44572: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 42613 1727204618.44669: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204618.44689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204618.44707: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204618.44735: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204618.44751: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204618.44793: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204618.44814: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204618.44833: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204618.44866: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204618.44879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204618.44990: variable 'network_connections' from source: play vars 42613 1727204618.44997: variable 'profile' from source: play vars 42613 1727204618.45074: variable 'profile' from source: play vars 42613 1727204618.45080: variable 'interface' from source: set_fact 42613 1727204618.45134: variable 'interface' from source: set_fact 42613 1727204618.45198: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 42613 1727204618.45221: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 42613 1727204618.45244: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204618.45269: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 42613 1727204618.45310: variable '__network_wireless_connections_defined' from source: role '' defaults 42613 1727204618.45521: variable 'network_connections' from source: play vars 42613 1727204618.45524: variable 'profile' from source: play vars 42613 1727204618.45849: variable 'profile' from source: play vars 42613 1727204618.45854: variable 'interface' from source: set_fact 42613 1727204618.45856: variable 'interface' from source: set_fact 42613 1727204618.45859: variable '__network_packages_default_wireless' from source: role '' defaults 42613 1727204618.45861: variable '__network_wireless_connections_defined' from source: role '' defaults 42613 1727204618.46592: variable 'network_connections' from source: play vars 42613 1727204618.46596: variable 'profile' from source: play vars 42613 1727204618.46599: variable 'profile' from source: play vars 42613 1727204618.46601: variable 'interface' from source: set_fact 42613 1727204618.46694: variable 'interface' from source: set_fact 42613 1727204618.46715: variable '__network_packages_default_team' from source: role '' defaults 42613 1727204618.46802: variable '__network_team_connections_defined' from source: role '' defaults 42613 1727204618.47267: variable 'network_connections' from source: play vars 42613 1727204618.47276: variable 'profile' from source: play vars 42613 1727204618.47360: variable 'profile' from source: play vars 42613 1727204618.47363: variable 'interface' from source: set_fact 42613 1727204618.47480: variable 'interface' from source: set_fact 42613 1727204618.47556: variable '__network_service_name_default_initscripts' from source: role '' defaults 42613 1727204618.47623: 
variable '__network_service_name_default_initscripts' from source: role '' defaults 42613 1727204618.47630: variable '__network_packages_default_initscripts' from source: role '' defaults 42613 1727204618.47681: variable '__network_packages_default_initscripts' from source: role '' defaults 42613 1727204618.47835: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 42613 1727204618.48202: variable 'network_connections' from source: play vars 42613 1727204618.48208: variable 'profile' from source: play vars 42613 1727204618.48257: variable 'profile' from source: play vars 42613 1727204618.48260: variable 'interface' from source: set_fact 42613 1727204618.48309: variable 'interface' from source: set_fact 42613 1727204618.48318: variable 'ansible_distribution' from source: facts 42613 1727204618.48322: variable '__network_rh_distros' from source: role '' defaults 42613 1727204618.48329: variable 'ansible_distribution_major_version' from source: facts 42613 1727204618.48343: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 42613 1727204618.48468: variable 'ansible_distribution' from source: facts 42613 1727204618.48472: variable '__network_rh_distros' from source: role '' defaults 42613 1727204618.48477: variable 'ansible_distribution_major_version' from source: facts 42613 1727204618.48483: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 42613 1727204618.48603: variable 'ansible_distribution' from source: facts 42613 1727204618.48607: variable '__network_rh_distros' from source: role '' defaults 42613 1727204618.48610: variable 'ansible_distribution_major_version' from source: facts 42613 1727204618.48638: variable 'network_provider' from source: set_fact 42613 1727204618.48655: variable 'ansible_facts' from source: unknown 42613 1727204618.49234: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 42613 
1727204618.49238: when evaluation is False, skipping this task 42613 1727204618.49241: _execute() done 42613 1727204618.49244: dumping result to json 42613 1727204618.49246: done dumping result, returning 42613 1727204618.49255: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages [127b8e07-fff9-2f91-05d8-000000000099] 42613 1727204618.49260: sending task result for task 127b8e07-fff9-2f91-05d8-000000000099 42613 1727204618.49367: done sending task result for task 127b8e07-fff9-2f91-05d8-000000000099 42613 1727204618.49370: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 42613 1727204618.49429: no more pending results, returning what we have 42613 1727204618.49432: results queue empty 42613 1727204618.49433: checking for any_errors_fatal 42613 1727204618.49442: done checking for any_errors_fatal 42613 1727204618.49442: checking for max_fail_percentage 42613 1727204618.49445: done checking for max_fail_percentage 42613 1727204618.49446: checking to see if all hosts have failed and the running result is not ok 42613 1727204618.49447: done checking to see if all hosts have failed 42613 1727204618.49447: getting the remaining hosts for this loop 42613 1727204618.49449: done getting the remaining hosts for this loop 42613 1727204618.49453: getting the next task for host managed-node3 42613 1727204618.49460: done getting next task for host managed-node3 42613 1727204618.49464: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 42613 1727204618.49468: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 42613 1727204618.49484: getting variables 42613 1727204618.49486: in VariableManager get_vars() 42613 1727204618.49524: Calling all_inventory to load vars for managed-node3 42613 1727204618.49527: Calling groups_inventory to load vars for managed-node3 42613 1727204618.49529: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204618.49545: Calling all_plugins_play to load vars for managed-node3 42613 1727204618.49548: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204618.49551: Calling groups_plugins_play to load vars for managed-node3 42613 1727204618.50630: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204618.51880: done with get_vars() 42613 1727204618.51910: done getting variables 42613 1727204618.51962: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 15:03:38 -0400 (0:00:00.148) 0:00:47.128 ***** 42613 1727204618.51995: entering _queue_task() for managed-node3/package 42613 1727204618.52291: worker is 1 (out of 1 available) 42613 1727204618.52306: exiting _queue_task() for managed-node3/package 42613 1727204618.52319: done queuing things up, now waiting for results queue to drain 42613 1727204618.52321: waiting for pending results... 
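The "Install packages" skip recorded above comes from the conditional `not network_packages is subset(ansible_facts.packages.keys())` evaluating to False, meaning every required package already appears in the gathered package facts. Jinja2's `subset` test corresponds to Python set containment; a minimal sketch (package names and versions are hypothetical, not taken from this log):

```python
# Sketch of the Jinja2 "subset" test behind the skip above.
# Package names/versions here are hypothetical illustrations.
network_packages = ["NetworkManager"]

# ansible_facts.packages maps package name -> list of installed versions
ansible_facts_packages = {
    "NetworkManager": [{"version": "1.48.10"}],
    "openssh-server": [{"version": "9.6p1"}],
}

# "network_packages is subset(ansible_facts.packages.keys())" in Jinja2
# is equivalent to set containment in Python:
already_installed = set(network_packages) <= set(ansible_facts_packages.keys())

# The task's "when" is the negation; a False result skips the task,
# matching 'Evaluated conditional (...): False' in the trace.
when_result = not already_installed
print(when_result)  # False -> task skipped
```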
42613 1727204618.52546: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 42613 1727204618.52637: in run() - task 127b8e07-fff9-2f91-05d8-00000000009a 42613 1727204618.52651: variable 'ansible_search_path' from source: unknown 42613 1727204618.52655: variable 'ansible_search_path' from source: unknown 42613 1727204618.52693: calling self._execute() 42613 1727204618.52783: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204618.52788: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204618.52796: variable 'omit' from source: magic vars 42613 1727204618.53107: variable 'ansible_distribution_major_version' from source: facts 42613 1727204618.53117: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204618.53216: variable 'network_state' from source: role '' defaults 42613 1727204618.53221: Evaluated conditional (network_state != {}): False 42613 1727204618.53225: when evaluation is False, skipping this task 42613 1727204618.53228: _execute() done 42613 1727204618.53232: dumping result to json 42613 1727204618.53235: done dumping result, returning 42613 1727204618.53245: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [127b8e07-fff9-2f91-05d8-00000000009a] 42613 1727204618.53249: sending task result for task 127b8e07-fff9-2f91-05d8-00000000009a 42613 1727204618.53358: done sending task result for task 127b8e07-fff9-2f91-05d8-00000000009a 42613 1727204618.53361: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 42613 1727204618.53418: no more pending results, returning what we have 42613 1727204618.53422: results queue empty 42613 1727204618.53423: checking 
for any_errors_fatal 42613 1727204618.53432: done checking for any_errors_fatal 42613 1727204618.53433: checking for max_fail_percentage 42613 1727204618.53436: done checking for max_fail_percentage 42613 1727204618.53437: checking to see if all hosts have failed and the running result is not ok 42613 1727204618.53438: done checking to see if all hosts have failed 42613 1727204618.53439: getting the remaining hosts for this loop 42613 1727204618.53443: done getting the remaining hosts for this loop 42613 1727204618.53448: getting the next task for host managed-node3 42613 1727204618.53455: done getting next task for host managed-node3 42613 1727204618.53460: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 42613 1727204618.53462: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204618.53486: getting variables 42613 1727204618.53488: in VariableManager get_vars() 42613 1727204618.53527: Calling all_inventory to load vars for managed-node3 42613 1727204618.53530: Calling groups_inventory to load vars for managed-node3 42613 1727204618.53532: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204618.53545: Calling all_plugins_play to load vars for managed-node3 42613 1727204618.53547: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204618.53550: Calling groups_plugins_play to load vars for managed-node3 42613 1727204618.54768: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204618.55999: done with get_vars() 42613 1727204618.56032: done getting variables 42613 1727204618.56088: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 15:03:38 -0400 (0:00:00.041) 0:00:47.169 ***** 42613 1727204618.56113: entering _queue_task() for managed-node3/package 42613 1727204618.56417: worker is 1 (out of 1 available) 42613 1727204618.56431: exiting _queue_task() for managed-node3/package 42613 1727204618.56446: done queuing things up, now waiting for results queue to drain 42613 1727204618.56448: waiting for pending results... 
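The NetworkManager/nmstate install task above is skipped because `network_state` still holds its role default of `{}`. A sketch of that evaluation (the variable name matches the role; the non-empty value is a hypothetical illustration):

```python
# Role default: network_state is an empty dict unless the playbook sets it.
network_state = {}

# Corresponds to "Evaluated conditional (network_state != {}): False"
run_task = network_state != {}
print(run_task)  # False -> "when evaluation is False, skipping this task"

# Had the playbook supplied a desired state, the task would run
# (contents here are purely illustrative):
network_state = {"interfaces": [{"name": "eth0", "state": "up"}]}
print(network_state != {})  # True
```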
42613 1727204618.56654: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 42613 1727204618.56728: in run() - task 127b8e07-fff9-2f91-05d8-00000000009b 42613 1727204618.56743: variable 'ansible_search_path' from source: unknown 42613 1727204618.56748: variable 'ansible_search_path' from source: unknown 42613 1727204618.56783: calling self._execute() 42613 1727204618.56875: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204618.56880: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204618.56892: variable 'omit' from source: magic vars 42613 1727204618.57215: variable 'ansible_distribution_major_version' from source: facts 42613 1727204618.57231: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204618.57332: variable 'network_state' from source: role '' defaults 42613 1727204618.57344: Evaluated conditional (network_state != {}): False 42613 1727204618.57351: when evaluation is False, skipping this task 42613 1727204618.57354: _execute() done 42613 1727204618.57357: dumping result to json 42613 1727204618.57359: done dumping result, returning 42613 1727204618.57362: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [127b8e07-fff9-2f91-05d8-00000000009b] 42613 1727204618.57367: sending task result for task 127b8e07-fff9-2f91-05d8-00000000009b 42613 1727204618.57480: done sending task result for task 127b8e07-fff9-2f91-05d8-00000000009b 42613 1727204618.57484: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 42613 1727204618.57537: no more pending results, returning what we have 42613 1727204618.57543: results queue empty 42613 1727204618.57545: checking for 
any_errors_fatal 42613 1727204618.57556: done checking for any_errors_fatal 42613 1727204618.57557: checking for max_fail_percentage 42613 1727204618.57559: done checking for max_fail_percentage 42613 1727204618.57560: checking to see if all hosts have failed and the running result is not ok 42613 1727204618.57561: done checking to see if all hosts have failed 42613 1727204618.57562: getting the remaining hosts for this loop 42613 1727204618.57564: done getting the remaining hosts for this loop 42613 1727204618.57570: getting the next task for host managed-node3 42613 1727204618.57577: done getting next task for host managed-node3 42613 1727204618.57581: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 42613 1727204618.57583: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204618.57605: getting variables 42613 1727204618.57607: in VariableManager get_vars() 42613 1727204618.57650: Calling all_inventory to load vars for managed-node3 42613 1727204618.57652: Calling groups_inventory to load vars for managed-node3 42613 1727204618.57654: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204618.57668: Calling all_plugins_play to load vars for managed-node3 42613 1727204618.57671: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204618.57674: Calling groups_plugins_play to load vars for managed-node3 42613 1727204618.58737: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204618.60176: done with get_vars() 42613 1727204618.60202: done getting variables 42613 1727204618.60256: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 15:03:38 -0400 (0:00:00.041) 0:00:47.210 ***** 42613 1727204618.60285: entering _queue_task() for managed-node3/service 42613 1727204618.60582: worker is 1 (out of 1 available) 42613 1727204618.60597: exiting _queue_task() for managed-node3/service 42613 1727204618.60611: done queuing things up, now waiting for results queue to drain 42613 1727204618.60612: waiting for pending results... 
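Throughout the trace, each variable lookup records its winning source ("play vars", "set_fact", "role '' defaults"). Ansible resolves these by precedence: `set_fact` beats play vars, which beat role defaults. A rough stdlib sketch of that layering with `collections.ChainMap` (a simplified subset of the real precedence order; all values are hypothetical):

```python
from collections import ChainMap

# Simplified model of the variable-source resolution shown in the log:
# set_fact > play vars > role defaults. Values are made up for illustration.
role_defaults = {"network_state": {}, "interface": "eth0"}
play_vars = {"profile": "ethtest0", "interface": "eth1"}
set_facts = {"interface": "lsr27"}

# ChainMap returns the value from the first mapping that defines the key,
# analogous to higher-precedence sources shadowing lower ones.
resolved = ChainMap(set_facts, play_vars, role_defaults)

print(resolved["interface"])      # from set_fact (shadows play vars/defaults)
print(resolved["profile"])        # from play vars
print(resolved["network_state"])  # falls through to the role default
```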
42613 1727204618.60816: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 42613 1727204618.60904: in run() - task 127b8e07-fff9-2f91-05d8-00000000009c 42613 1727204618.60917: variable 'ansible_search_path' from source: unknown 42613 1727204618.60920: variable 'ansible_search_path' from source: unknown 42613 1727204618.60955: calling self._execute() 42613 1727204618.61042: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204618.61046: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204618.61059: variable 'omit' from source: magic vars 42613 1727204618.61371: variable 'ansible_distribution_major_version' from source: facts 42613 1727204618.61381: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204618.61474: variable '__network_wireless_connections_defined' from source: role '' defaults 42613 1727204618.61633: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 42613 1727204618.63348: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 42613 1727204618.63408: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 42613 1727204618.63438: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 42613 1727204618.63472: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 42613 1727204618.63492: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 42613 1727204618.63559: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 42613 1727204618.63584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204618.63604: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204618.63633: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204618.63646: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204618.63690: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204618.63707: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204618.63725: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204618.63754: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204618.63766: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204618.63802: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204618.63819: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204618.63836: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204618.63864: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204618.63878: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204618.64010: variable 'network_connections' from source: play vars 42613 1727204618.64022: variable 'profile' from source: play vars 42613 1727204618.64085: variable 'profile' from source: play vars 42613 1727204618.64089: variable 'interface' from source: set_fact 42613 1727204618.64138: variable 'interface' from source: set_fact 42613 1727204618.64197: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 42613 1727204618.64338: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 42613 1727204618.64372: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 42613 1727204618.64398: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 42613 1727204618.64426: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 42613 1727204618.64464: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 42613 1727204618.64482: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 42613 1727204618.64500: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204618.64518: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 42613 1727204618.64563: variable '__network_team_connections_defined' from source: role '' defaults 42613 1727204618.64746: variable 'network_connections' from source: play vars 42613 1727204618.64751: variable 'profile' from source: play vars 42613 1727204618.64803: variable 'profile' from source: play vars 42613 1727204618.64807: variable 'interface' from source: set_fact 42613 1727204618.64852: variable 'interface' from source: set_fact 42613 1727204618.64876: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 42613 1727204618.64880: when evaluation is False, skipping this task 42613 1727204618.64882: _execute() done 42613 1727204618.64885: dumping result to json 42613 1727204618.64887: done dumping result, returning 42613 1727204618.64933: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart 
NetworkManager due to wireless or team interfaces [127b8e07-fff9-2f91-05d8-00000000009c] 42613 1727204618.64945: sending task result for task 127b8e07-fff9-2f91-05d8-00000000009c 42613 1727204618.65030: done sending task result for task 127b8e07-fff9-2f91-05d8-00000000009c 42613 1727204618.65033: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 42613 1727204618.65082: no more pending results, returning what we have 42613 1727204618.65085: results queue empty 42613 1727204618.65086: checking for any_errors_fatal 42613 1727204618.65094: done checking for any_errors_fatal 42613 1727204618.65094: checking for max_fail_percentage 42613 1727204618.65096: done checking for max_fail_percentage 42613 1727204618.65097: checking to see if all hosts have failed and the running result is not ok 42613 1727204618.65098: done checking to see if all hosts have failed 42613 1727204618.65099: getting the remaining hosts for this loop 42613 1727204618.65100: done getting the remaining hosts for this loop 42613 1727204618.65104: getting the next task for host managed-node3 42613 1727204618.65112: done getting next task for host managed-node3 42613 1727204618.65116: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 42613 1727204618.65117: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204618.65132: getting variables 42613 1727204618.65134: in VariableManager get_vars() 42613 1727204618.65177: Calling all_inventory to load vars for managed-node3 42613 1727204618.65180: Calling groups_inventory to load vars for managed-node3 42613 1727204618.65182: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204618.65193: Calling all_plugins_play to load vars for managed-node3 42613 1727204618.65196: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204618.65199: Calling groups_plugins_play to load vars for managed-node3 42613 1727204618.66469: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204618.68830: done with get_vars() 42613 1727204618.68887: done getting variables 42613 1727204618.68969: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 15:03:38 -0400 (0:00:00.087) 0:00:47.298 ***** 42613 1727204618.69007: entering _queue_task() for managed-node3/service 42613 1727204618.69432: worker is 1 (out of 1 available) 42613 1727204618.69447: exiting _queue_task() for managed-node3/service 42613 1727204618.69461: done queuing things up, now waiting for results queue to drain 42613 1727204618.69462: waiting for pending results... 
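The "Restart NetworkManager" task above is skipped because neither `__network_wireless_connections_defined` nor `__network_team_connections_defined` is true; the role derives those flags by scanning `network_connections` for wireless or team profiles. A hedged Python approximation (the real role computes this with Jinja2; the connection list here is hypothetical):

```python
# Approximate the two role-default flags from the skip condition above.
# The connection entries are illustrative, not taken from this log.
network_connections = [
    {"name": "ethtest0", "type": "ethernet", "state": "up"},
]

wireless_defined = any(c.get("type") == "wireless" for c in network_connections)
team_defined = any(c.get("type") == "team" for c in network_connections)

# Mirrors "Evaluated conditional (__network_wireless_connections_defined
# or __network_team_connections_defined): False" -> task skipped.
print(wireless_defined or team_defined)  # False
```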
42613 1727204618.69799: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 42613 1727204618.70003: in run() - task 127b8e07-fff9-2f91-05d8-00000000009d 42613 1727204618.70007: variable 'ansible_search_path' from source: unknown 42613 1727204618.70010: variable 'ansible_search_path' from source: unknown 42613 1727204618.70018: calling self._execute() 42613 1727204618.70153: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204618.70168: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204618.70185: variable 'omit' from source: magic vars 42613 1727204618.70616: variable 'ansible_distribution_major_version' from source: facts 42613 1727204618.70635: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204618.70872: variable 'network_provider' from source: set_fact 42613 1727204618.70876: variable 'network_state' from source: role '' defaults 42613 1727204618.70879: Evaluated conditional (network_provider == "nm" or network_state != {}): True 42613 1727204618.70882: variable 'omit' from source: magic vars 42613 1727204618.70903: variable 'omit' from source: magic vars 42613 1727204618.70943: variable 'network_service_name' from source: role '' defaults 42613 1727204618.71026: variable 'network_service_name' from source: role '' defaults 42613 1727204618.71150: variable '__network_provider_setup' from source: role '' defaults 42613 1727204618.71161: variable '__network_service_name_default_nm' from source: role '' defaults 42613 1727204618.71233: variable '__network_service_name_default_nm' from source: role '' defaults 42613 1727204618.71253: variable '__network_packages_default_nm' from source: role '' defaults 42613 1727204618.71371: variable '__network_packages_default_nm' from source: role '' defaults 42613 1727204618.71585: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 42613 1727204618.74041: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 42613 1727204618.74603: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 42613 1727204618.74662: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 42613 1727204618.74716: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 42613 1727204618.74753: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 42613 1727204618.74919: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204618.74924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204618.74926: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204618.74976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204618.74998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204618.75247: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 42613 1727204618.75251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204618.75253: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204618.75325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204618.75351: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204618.75636: variable '__network_packages_default_gobject_packages' from source: role '' defaults 42613 1727204618.75790: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204618.75824: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204618.75858: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204618.75912: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204618.75931: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204618.76049: variable 'ansible_python' from source: facts 42613 1727204618.76083: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 42613 1727204618.76189: variable '__network_wpa_supplicant_required' from source: role '' defaults 42613 1727204618.76290: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 42613 1727204618.76470: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204618.76484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204618.76514: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204618.76575: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204618.76666: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204618.76673: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204618.76698: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204618.76727: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204618.76781: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204618.76807: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204618.76991: variable 'network_connections' from source: play vars 42613 1727204618.77005: variable 'profile' from source: play vars 42613 1727204618.77107: variable 'profile' from source: play vars 42613 1727204618.77118: variable 'interface' from source: set_fact 42613 1727204618.77192: variable 'interface' from source: set_fact 42613 1727204618.77321: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 42613 1727204618.77645: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 42613 1727204618.77660: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 42613 1727204618.77717: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 42613 1727204618.77780: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 42613 1727204618.77869: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 42613 1727204618.77909: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 42613 1727204618.77953: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204618.78002: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 42613 1727204618.78074: variable '__network_wireless_connections_defined' from source: role '' defaults 42613 1727204618.78469: variable 'network_connections' from source: play vars 42613 1727204618.78486: variable 'profile' from source: play vars 42613 1727204618.78583: variable 'profile' from source: play vars 42613 1727204618.78732: variable 'interface' from source: set_fact 42613 1727204618.78736: variable 'interface' from source: set_fact 42613 1727204618.78738: variable '__network_packages_default_wireless' from source: role '' defaults 42613 1727204618.78899: variable '__network_wireless_connections_defined' from source: role '' defaults 42613 1727204618.79332: variable 'network_connections' from source: play vars 42613 1727204618.79348: variable 'profile' from source: play vars 42613 1727204618.79461: variable 'profile' from source: play vars 42613 1727204618.79476: variable 'interface' from source: set_fact 42613 1727204618.79587: variable 'interface' from source: set_fact 42613 1727204618.79620: variable '__network_packages_default_team' from source: role '' defaults 42613 1727204618.79720: variable '__network_team_connections_defined' from source: role '' defaults 42613 1727204618.80069: variable 
'network_connections' from source: play vars 42613 1727204618.80146: variable 'profile' from source: play vars 42613 1727204618.80176: variable 'profile' from source: play vars 42613 1727204618.80187: variable 'interface' from source: set_fact 42613 1727204618.80274: variable 'interface' from source: set_fact 42613 1727204618.80349: variable '__network_service_name_default_initscripts' from source: role '' defaults 42613 1727204618.80424: variable '__network_service_name_default_initscripts' from source: role '' defaults 42613 1727204618.80436: variable '__network_packages_default_initscripts' from source: role '' defaults 42613 1727204618.80511: variable '__network_packages_default_initscripts' from source: role '' defaults 42613 1727204618.80761: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 42613 1727204618.81315: variable 'network_connections' from source: play vars 42613 1727204618.81344: variable 'profile' from source: play vars 42613 1727204618.81405: variable 'profile' from source: play vars 42613 1727204618.81449: variable 'interface' from source: set_fact 42613 1727204618.81499: variable 'interface' from source: set_fact 42613 1727204618.81513: variable 'ansible_distribution' from source: facts 42613 1727204618.81521: variable '__network_rh_distros' from source: role '' defaults 42613 1727204618.81531: variable 'ansible_distribution_major_version' from source: facts 42613 1727204618.81557: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 42613 1727204618.81744: variable 'ansible_distribution' from source: facts 42613 1727204618.81754: variable '__network_rh_distros' from source: role '' defaults 42613 1727204618.81763: variable 'ansible_distribution_major_version' from source: facts 42613 1727204618.81779: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 42613 1727204618.81955: variable 'ansible_distribution' from source: 
facts 42613 1727204618.81964: variable '__network_rh_distros' from source: role '' defaults 42613 1727204618.81976: variable 'ansible_distribution_major_version' from source: facts 42613 1727204618.82019: variable 'network_provider' from source: set_fact 42613 1727204618.82051: variable 'omit' from source: magic vars 42613 1727204618.82098: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 42613 1727204618.82172: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 42613 1727204618.82176: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 42613 1727204618.82181: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204618.82202: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204618.82246: variable 'inventory_hostname' from source: host vars for 'managed-node3' 42613 1727204618.82255: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204618.82262: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204618.82394: Set connection var ansible_shell_executable to /bin/sh 42613 1727204618.82406: Set connection var ansible_pipelining to False 42613 1727204618.82426: Set connection var ansible_module_compression to ZIP_DEFLATED 42613 1727204618.82428: Set connection var ansible_connection to ssh 42613 1727204618.82572: Set connection var ansible_timeout to 10 42613 1727204618.82575: Set connection var ansible_shell_type to sh 42613 1727204618.82578: variable 'ansible_shell_executable' from source: unknown 42613 1727204618.82580: variable 'ansible_connection' from source: unknown 42613 1727204618.82582: variable 'ansible_module_compression' from source: unknown 42613 1727204618.82585: 
variable 'ansible_shell_type' from source: unknown 42613 1727204618.82587: variable 'ansible_shell_executable' from source: unknown 42613 1727204618.82589: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204618.82596: variable 'ansible_pipelining' from source: unknown 42613 1727204618.82599: variable 'ansible_timeout' from source: unknown 42613 1727204618.82601: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204618.82662: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 42613 1727204618.82681: variable 'omit' from source: magic vars 42613 1727204618.82691: starting attempt loop 42613 1727204618.82697: running the handler 42613 1727204618.82807: variable 'ansible_facts' from source: unknown 42613 1727204618.83894: _low_level_execute_command(): starting 42613 1727204618.83914: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 42613 1727204618.84775: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204618.84799: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204618.84825: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204618.84844: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204618.84869: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204618.85028: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204618.86869: stdout chunk (state=3): >>>/root <<< 42613 1727204618.87321: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204618.87325: stdout chunk (state=3): >>><<< 42613 1727204618.87327: stderr chunk (state=3): >>><<< 42613 1727204618.87330: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204618.87332: _low_level_execute_command(): starting 42613 1727204618.87335: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204618.8721447-45793-252012163582526 `" && echo ansible-tmp-1727204618.8721447-45793-252012163582526="` echo /root/.ansible/tmp/ansible-tmp-1727204618.8721447-45793-252012163582526 `" ) && sleep 0' 42613 1727204618.88455: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204618.88790: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204618.88826: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204618.88897: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 
2 <<< 42613 1727204618.91081: stdout chunk (state=3): >>>ansible-tmp-1727204618.8721447-45793-252012163582526=/root/.ansible/tmp/ansible-tmp-1727204618.8721447-45793-252012163582526 <<< 42613 1727204618.91189: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204618.91279: stderr chunk (state=3): >>><<< 42613 1727204618.91484: stdout chunk (state=3): >>><<< 42613 1727204618.91506: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204618.8721447-45793-252012163582526=/root/.ansible/tmp/ansible-tmp-1727204618.8721447-45793-252012163582526 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204618.91545: variable 'ansible_module_compression' from source: unknown 42613 1727204618.91603: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-42613igbktgy6/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 42613 1727204618.91879: variable 'ansible_facts' from source: unknown 42613 1727204618.92132: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204618.8721447-45793-252012163582526/AnsiballZ_systemd.py 42613 1727204618.92407: Sending initial data 42613 1727204618.92410: Sent initial data (156 bytes) 42613 1727204618.93490: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204618.93516: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204618.93623: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204618.95511: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 
debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 42613 1727204618.95775: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 42613 1727204618.95891: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-42613igbktgy6/tmp_fzfm0oi /root/.ansible/tmp/ansible-tmp-1727204618.8721447-45793-252012163582526/AnsiballZ_systemd.py <<< 42613 1727204618.95900: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204618.8721447-45793-252012163582526/AnsiballZ_systemd.py" <<< 42613 1727204618.95907: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-42613igbktgy6/tmp_fzfm0oi" to remote "/root/.ansible/tmp/ansible-tmp-1727204618.8721447-45793-252012163582526/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204618.8721447-45793-252012163582526/AnsiballZ_systemd.py" <<< 42613 1727204618.98143: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204618.98157: stdout chunk (state=3): >>><<< 42613 1727204618.98172: stderr chunk (state=3): >>><<< 42613 1727204618.98264: done transferring module to remote 42613 1727204618.98286: _low_level_execute_command(): starting 42613 1727204618.98423: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204618.8721447-45793-252012163582526/ 
/root/.ansible/tmp/ansible-tmp-1727204618.8721447-45793-252012163582526/AnsiballZ_systemd.py && sleep 0' 42613 1727204618.99889: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204618.99992: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204619.00171: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204619.02297: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204619.02459: stderr chunk (state=3): >>><<< 42613 1727204619.02484: stdout chunk (state=3): >>><<< 42613 1727204619.02501: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204619.02510: _low_level_execute_command(): starting 42613 1727204619.02559: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204618.8721447-45793-252012163582526/AnsiballZ_systemd.py && sleep 0' 42613 1727204619.03997: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204619.04001: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 42613 1727204619.04401: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204619.04869: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204619.38007: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "670", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:48:15 EDT", "ExecMainStartTimestampMonotonic": "32994154", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "670", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager 
; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3507", "MemoryCurrent": "11837440", "MemoryPeak": "13684736", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3528794112", "CPUUsageNSec": "3269040000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", 
"MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCO<<< 42613 1727204619.38043: stdout chunk (state=3): >>>RE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": 
"0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": 
"15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target NetworkManager-wait-online.service cloud-init.service network.service multi-user.target network.target", "After": "dbus-broker.service cloud-init-local.service systemd-journald.socket system.slice dbus.socket sysinit.target network-pre.target basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": <<< 42613 1727204619.38055: stdout chunk (state=3): >>>"system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 15:00:56 EDT", "StateChangeTimestampMonotonic": "794185509", "InactiveExitTimestamp": "Tue 2024-09-24 14:48:15 EDT", "InactiveExitTimestampMonotonic": "32994691", "ActiveEnterTimestamp": "Tue 2024-09-24 14:48:17 EDT", "ActiveEnterTimestampMonotonic": "34735054", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", 
"JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:48:15 EDT", "ConditionTimestampMonotonic": "32982961", "AssertTimestamp": "Tue 2024-09-24 14:48:15 EDT", "AssertTimestampMonotonic": "32982965", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "4c25d2827e7b45838bcc13e108827a7f", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 42613 1727204619.40473: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. <<< 42613 1727204619.40478: stdout chunk (state=3): >>><<< 42613 1727204619.40480: stderr chunk (state=3): >>><<< 42613 1727204619.40485: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "670", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", 
"ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:48:15 EDT", "ExecMainStartTimestampMonotonic": "32994154", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "670", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3507", "MemoryCurrent": "11837440", "MemoryPeak": "13684736", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3528794112", "CPUUsageNSec": "3269040000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", 
"CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", 
"IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target NetworkManager-wait-online.service cloud-init.service network.service multi-user.target network.target", "After": "dbus-broker.service cloud-init-local.service systemd-journald.socket system.slice dbus.socket sysinit.target network-pre.target basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 15:00:56 EDT", 
"StateChangeTimestampMonotonic": "794185509", "InactiveExitTimestamp": "Tue 2024-09-24 14:48:15 EDT", "InactiveExitTimestampMonotonic": "32994691", "ActiveEnterTimestamp": "Tue 2024-09-24 14:48:17 EDT", "ActiveEnterTimestampMonotonic": "34735054", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:48:15 EDT", "ConditionTimestampMonotonic": "32982961", "AssertTimestamp": "Tue 2024-09-24 14:48:15 EDT", "AssertTimestampMonotonic": "32982965", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "4c25d2827e7b45838bcc13e108827a7f", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 42613 1727204619.40868: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204618.8721447-45793-252012163582526/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 42613 1727204619.40890: _low_level_execute_command(): starting 42613 1727204619.40893: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204618.8721447-45793-252012163582526/ > /dev/null 2>&1 && sleep 0' 42613 1727204619.42256: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204619.42278: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config <<< 42613 1727204619.42298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204619.42319: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 42613 1727204619.42406: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204619.42449: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204619.42471: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204619.42499: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204619.42606: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204619.44867: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204619.44872: stdout chunk (state=3): >>><<< 42613 1727204619.44972: stderr chunk (state=3): >>><<< 42613 1727204619.44977: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204619.44980: handler run complete 42613 1727204619.45026: attempt loop complete, returning result 42613 1727204619.45125: _execute() done 42613 1727204619.45128: dumping result to json 42613 1727204619.45269: done dumping result, returning 42613 1727204619.45390: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [127b8e07-fff9-2f91-05d8-00000000009d] 42613 1727204619.45395: sending task result for task 127b8e07-fff9-2f91-05d8-00000000009d ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 42613 1727204619.46278: no more pending results, returning what we have 42613 1727204619.46282: results queue empty 42613 1727204619.46283: checking for any_errors_fatal 42613 1727204619.46289: done checking for any_errors_fatal 42613 1727204619.46290: checking for max_fail_percentage 42613 1727204619.46292: done checking for max_fail_percentage 42613 1727204619.46293: checking to see if all 
hosts have failed and the running result is not ok 42613 1727204619.46294: done checking to see if all hosts have failed 42613 1727204619.46295: getting the remaining hosts for this loop 42613 1727204619.46296: done getting the remaining hosts for this loop 42613 1727204619.46299: getting the next task for host managed-node3 42613 1727204619.46305: done getting next task for host managed-node3 42613 1727204619.46309: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 42613 1727204619.46311: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 42613 1727204619.46320: getting variables 42613 1727204619.46322: in VariableManager get_vars() 42613 1727204619.46357: Calling all_inventory to load vars for managed-node3 42613 1727204619.46360: Calling groups_inventory to load vars for managed-node3 42613 1727204619.46362: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204619.46374: Calling all_plugins_play to load vars for managed-node3 42613 1727204619.46377: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204619.46380: Calling groups_plugins_play to load vars for managed-node3 42613 1727204619.47085: done sending task result for task 127b8e07-fff9-2f91-05d8-00000000009d 42613 1727204619.47090: WORKER PROCESS EXITING 42613 1727204619.48701: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204619.51451: done with get_vars() 42613 1727204619.51508: done getting variables 42613 1727204619.51583: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 15:03:39 -0400 (0:00:00.826) 0:00:48.124 ***** 42613 1727204619.51623: entering _queue_task() for managed-node3/service 42613 1727204619.52287: worker is 1 (out of 1 available) 42613 1727204619.52300: exiting _queue_task() for managed-node3/service 42613 1727204619.52311: done queuing things up, now waiting for results queue to drain 42613 1727204619.52312: waiting for pending results... 42613 1727204619.52405: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 42613 1727204619.52551: in run() - task 127b8e07-fff9-2f91-05d8-00000000009e 42613 1727204619.52579: variable 'ansible_search_path' from source: unknown 42613 1727204619.52588: variable 'ansible_search_path' from source: unknown 42613 1727204619.52634: calling self._execute() 42613 1727204619.52768: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204619.52783: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204619.52799: variable 'omit' from source: magic vars 42613 1727204619.53245: variable 'ansible_distribution_major_version' from source: facts 42613 1727204619.53268: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204619.53419: variable 'network_provider' from source: set_fact 42613 1727204619.53431: Evaluated conditional (network_provider == "nm"): True 42613 1727204619.53541: variable '__network_wpa_supplicant_required' from source: role '' defaults 42613 1727204619.53646: variable '__network_ieee802_1x_connections_defined' from source: role '' 
defaults 42613 1727204619.53855: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 42613 1727204619.56497: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 42613 1727204619.56603: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 42613 1727204619.56654: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 42613 1727204619.56704: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 42613 1727204619.56736: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 42613 1727204619.56850: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204619.56896: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204619.56930: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204619.56979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204619.57006: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204619.57061: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204619.57093: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204619.57171: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204619.57175: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 1727204619.57194: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204619.57249: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 42613 1727204619.57280: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 42613 1727204619.57307: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204619.57357: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 42613 
1727204619.57378: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 42613 1727204619.57547: variable 'network_connections' from source: play vars 42613 1727204619.57656: variable 'profile' from source: play vars 42613 1727204619.57659: variable 'profile' from source: play vars 42613 1727204619.57661: variable 'interface' from source: set_fact 42613 1727204619.57731: variable 'interface' from source: set_fact 42613 1727204619.57825: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 42613 1727204619.58038: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 42613 1727204619.58096: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 42613 1727204619.58135: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 42613 1727204619.58174: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 42613 1727204619.58238: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 42613 1727204619.58273: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 42613 1727204619.58313: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 42613 1727204619.58348: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
42613 1727204619.58419: variable '__network_wireless_connections_defined' from source: role '' defaults
42613 1727204619.58759: variable 'network_connections' from source: play vars
42613 1727204619.58773: variable 'profile' from source: play vars
42613 1727204619.58863: variable 'profile' from source: play vars
42613 1727204619.58870: variable 'interface' from source: set_fact
42613 1727204619.58964: variable 'interface' from source: set_fact
42613 1727204619.58988: Evaluated conditional (__network_wpa_supplicant_required): False
42613 1727204619.58996: when evaluation is False, skipping this task
42613 1727204619.59002: _execute() done
42613 1727204619.59018: dumping result to json
42613 1727204619.59026: done dumping result, returning
42613 1727204619.59038: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [127b8e07-fff9-2f91-05d8-00000000009e]
42613 1727204619.59049: sending task result for task 127b8e07-fff9-2f91-05d8-00000000009e
skipping: [managed-node3] => {
    "changed": false,
    "false_condition": "__network_wpa_supplicant_required",
    "skip_reason": "Conditional result was False"
}
42613 1727204619.59322: no more pending results, returning what we have
42613 1727204619.59326: results queue empty
42613 1727204619.59328: checking for any_errors_fatal
42613 1727204619.59348: done checking for any_errors_fatal
42613 1727204619.59349: checking for max_fail_percentage
42613 1727204619.59352: done checking for max_fail_percentage
42613 1727204619.59353: checking to see if all hosts have failed and the running result is not ok
42613 1727204619.59355: done checking to see if all hosts have failed
42613 1727204619.59355: getting the remaining hosts for this loop
42613 1727204619.59357: done getting the remaining hosts for this loop
42613 1727204619.59362: getting the next task for host managed-node3
42613 1727204619.59371: done getting next task for host managed-node3
42613 1727204619.59376: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service
42613 1727204619.59471: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
42613 1727204619.59490: getting variables
42613 1727204619.59492: in VariableManager get_vars()
42613 1727204619.59532: Calling all_inventory to load vars for managed-node3
42613 1727204619.59535: Calling groups_inventory to load vars for managed-node3
42613 1727204619.59537: Calling all_plugins_inventory to load vars for managed-node3
42613 1727204619.59551: Calling all_plugins_play to load vars for managed-node3
42613 1727204619.59554: Calling groups_plugins_inventory to load vars for managed-node3
42613 1727204619.59557: Calling groups_plugins_play to load vars for managed-node3
42613 1727204619.60075: done sending task result for task 127b8e07-fff9-2f91-05d8-00000000009e
42613 1727204619.60080: WORKER PROCESS EXITING
42613 1727204619.61590: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
42613 1727204619.63961: done with get_vars()
42613 1727204619.64004: done getting variables
42613 1727204619.64084: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable network service] **************
task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142
Tuesday 24 September 2024 15:03:39 -0400 (0:00:00.124) 0:00:48.249 *****
42613 1727204619.64118: entering _queue_task() for managed-node3/service
42613 1727204619.64527: worker is 1 (out of 1 available)
42613 1727204619.64542: exiting _queue_task() for managed-node3/service
42613 1727204619.64557: done queuing things up, now waiting for results queue to drain
42613 1727204619.64559: waiting for pending results...
42613 1727204619.64898: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service
42613 1727204619.65042: in run() - task 127b8e07-fff9-2f91-05d8-00000000009f
42613 1727204619.65068: variable 'ansible_search_path' from source: unknown
42613 1727204619.65078: variable 'ansible_search_path' from source: unknown
42613 1727204619.65172: calling self._execute()
42613 1727204619.65253: variable 'ansible_host' from source: host vars for 'managed-node3'
42613 1727204619.65270: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
42613 1727204619.65288: variable 'omit' from source: magic vars
42613 1727204619.65732: variable 'ansible_distribution_major_version' from source: facts
42613 1727204619.65751: Evaluated conditional (ansible_distribution_major_version != '6'): True
42613 1727204619.65896: variable 'network_provider' from source: set_fact
42613 1727204619.65970: Evaluated conditional (network_provider == "initscripts"): False
42613 1727204619.65974: when evaluation is False, skipping this task
42613 1727204619.65977: _execute() done
42613 1727204619.65980: dumping result to json
42613 1727204619.65982: done dumping result, returning
42613 1727204619.65985: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service [127b8e07-fff9-2f91-05d8-00000000009f]
42613 1727204619.65987: sending task result for task 127b8e07-fff9-2f91-05d8-00000000009f
42613 1727204619.66279: done sending task result for task 127b8e07-fff9-2f91-05d8-00000000009f
42613 1727204619.66283: WORKER PROCESS EXITING
skipping: [managed-node3] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
42613 1727204619.66332: no more pending results, returning what we have
42613 1727204619.66336: results queue empty
42613 1727204619.66337: checking for any_errors_fatal
42613 1727204619.66344: done checking for any_errors_fatal
42613 1727204619.66344: checking for max_fail_percentage
42613 1727204619.66347: done checking for max_fail_percentage
42613 1727204619.66347: checking to see if all hosts have failed and the running result is not ok
42613 1727204619.66348: done checking to see if all hosts have failed
42613 1727204619.66349: getting the remaining hosts for this loop
42613 1727204619.66351: done getting the remaining hosts for this loop
42613 1727204619.66355: getting the next task for host managed-node3
42613 1727204619.66361: done getting next task for host managed-node3
42613 1727204619.66367: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present
42613 1727204619.66370: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
42613 1727204619.66387: getting variables
42613 1727204619.66389: in VariableManager get_vars()
42613 1727204619.66437: Calling all_inventory to load vars for managed-node3
42613 1727204619.66441: Calling groups_inventory to load vars for managed-node3
42613 1727204619.66443: Calling all_plugins_inventory to load vars for managed-node3
42613 1727204619.66457: Calling all_plugins_play to load vars for managed-node3
42613 1727204619.66460: Calling groups_plugins_inventory to load vars for managed-node3
42613 1727204619.66464: Calling groups_plugins_play to load vars for managed-node3
42613 1727204619.68492: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
42613 1727204619.71936: done with get_vars()
42613 1727204619.71989: done getting variables
42613 1727204619.72057: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] ***
task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150
Tuesday 24 September 2024 15:03:39 -0400 (0:00:00.079) 0:00:48.329 *****
42613 1727204619.72100: entering _queue_task() for managed-node3/copy
42613 1727204619.72505: worker is 1 (out of 1 available)
42613 1727204619.72528: exiting _queue_task() for managed-node3/copy
42613 1727204619.72540: done queuing things up, now waiting for results queue to drain
42613 1727204619.72542: waiting for pending results...
42613 1727204619.72778: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present
42613 1727204619.72914: in run() - task 127b8e07-fff9-2f91-05d8-0000000000a0
42613 1727204619.72937: variable 'ansible_search_path' from source: unknown
42613 1727204619.72951: variable 'ansible_search_path' from source: unknown
42613 1727204619.72999: calling self._execute()
42613 1727204619.73111: variable 'ansible_host' from source: host vars for 'managed-node3'
42613 1727204619.73126: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
42613 1727204619.73144: variable 'omit' from source: magic vars
42613 1727204619.73584: variable 'ansible_distribution_major_version' from source: facts
42613 1727204619.73671: Evaluated conditional (ansible_distribution_major_version != '6'): True
42613 1727204619.73743: variable 'network_provider' from source: set_fact
42613 1727204619.73757: Evaluated conditional (network_provider == "initscripts"): False
42613 1727204619.73767: when evaluation is False, skipping this task
42613 1727204619.73775: _execute() done
42613 1727204619.73784: dumping result to json
42613 1727204619.73793: done dumping result, returning
42613 1727204619.73807: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [127b8e07-fff9-2f91-05d8-0000000000a0]
42613 1727204619.73820: sending task result for task 127b8e07-fff9-2f91-05d8-0000000000a0
skipping: [managed-node3] => {
    "changed": false,
    "false_condition": "network_provider == \"initscripts\"",
    "skip_reason": "Conditional result was False"
}
42613 1727204619.74003: no more pending results, returning what we have
42613 1727204619.74008: results queue empty
42613 1727204619.74009: checking for any_errors_fatal
42613 1727204619.74016: done checking for any_errors_fatal
42613 1727204619.74017: checking for max_fail_percentage
42613 1727204619.74019: done checking for max_fail_percentage
42613 1727204619.74020: checking to see if all hosts have failed and the running result is not ok
42613 1727204619.74021: done checking to see if all hosts have failed
42613 1727204619.74022: getting the remaining hosts for this loop
42613 1727204619.74025: done getting the remaining hosts for this loop
42613 1727204619.74029: getting the next task for host managed-node3
42613 1727204619.74037: done getting next task for host managed-node3
42613 1727204619.74044: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles
42613 1727204619.74046: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
42613 1727204619.74061: getting variables
42613 1727204619.74063: in VariableManager get_vars()
42613 1727204619.74114: Calling all_inventory to load vars for managed-node3
42613 1727204619.74117: Calling groups_inventory to load vars for managed-node3
42613 1727204619.74120: Calling all_plugins_inventory to load vars for managed-node3
42613 1727204619.74135: Calling all_plugins_play to load vars for managed-node3
42613 1727204619.74142: Calling groups_plugins_inventory to load vars for managed-node3
42613 1727204619.74146: Calling groups_plugins_play to load vars for managed-node3
42613 1727204619.75185: done sending task result for task 127b8e07-fff9-2f91-05d8-0000000000a0
42613 1727204619.75189: WORKER PROCESS EXITING
42613 1727204619.76455: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
42613 1727204619.78709: done with get_vars()
42613 1727204619.78756: done getting variables

TASK [fedora.linux_system_roles.network : Configure networking connection profiles] ***
task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Tuesday 24 September 2024 15:03:39 -0400 (0:00:00.067) 0:00:48.396 *****
42613 1727204619.78851: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_connections
42613 1727204619.79252: worker is 1 (out of 1 available)
42613 1727204619.79268: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_connections
42613 1727204619.79282: done queuing things up, now waiting for results queue to drain
42613 1727204619.79283: waiting for pending results...
42613 1727204619.79613: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles
42613 1727204619.79753: in run() - task 127b8e07-fff9-2f91-05d8-0000000000a1
42613 1727204619.79781: variable 'ansible_search_path' from source: unknown
42613 1727204619.79789: variable 'ansible_search_path' from source: unknown
42613 1727204619.79835: calling self._execute()
42613 1727204619.79960: variable 'ansible_host' from source: host vars for 'managed-node3'
42613 1727204619.79976: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
42613 1727204619.79990: variable 'omit' from source: magic vars
42613 1727204619.80443: variable 'ansible_distribution_major_version' from source: facts
42613 1727204619.80467: Evaluated conditional (ansible_distribution_major_version != '6'): True
42613 1727204619.80480: variable 'omit' from source: magic vars
42613 1727204619.80532: variable 'omit' from source: magic vars
42613 1727204619.80736: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
42613 1727204619.83276: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
42613 1727204619.83373: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
42613 1727204619.83445: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
42613 1727204619.83468: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
42613 1727204619.83551: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
42613 1727204619.83608: variable 'network_provider' from source: set_fact
42613 1727204619.83777: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
42613 1727204619.83831: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
42613 1727204619.83869: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
42613 1727204619.83922: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
42613 1727204619.83944: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
42613 1727204619.84032: variable 'omit' from source: magic vars
42613 1727204619.84203: variable 'omit' from source: magic vars
42613 1727204619.84293: variable 'network_connections' from source: play vars
42613 1727204619.84313: variable 'profile' from source: play vars
42613 1727204619.84386: variable 'profile' from source: play vars
42613 1727204619.84395: variable 'interface' from source: set_fact
42613 1727204619.84875: variable 'interface' from source: set_fact
42613 1727204619.84897: variable 'omit' from source: magic vars
42613 1727204619.84911: variable '__lsr_ansible_managed' from source: task vars
42613 1727204619.85096: variable '__lsr_ansible_managed' from source: task vars
42613 1727204619.85748: Loaded config def from plugin (lookup/template)
42613 1727204619.85753: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py
42613 1727204619.85790: File lookup term: get_ansible_managed.j2
42613 1727204619.85861: variable 'ansible_search_path' from source: unknown
42613 1727204619.85876: evaluation_path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks
42613 1727204619.85895: search_path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2
42613 1727204619.85918: variable 'ansible_search_path' from source: unknown
42613 1727204620.05558: variable 'ansible_managed' from source: unknown
42613 1727204620.05827: variable 'omit' from source: magic vars
42613 1727204620.05881: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
42613 1727204620.05910: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
42613 1727204620.05928: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
42613 1727204620.05950: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
42613 1727204620.05968: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
42613 1727204620.05998: variable 'inventory_hostname' from source: host vars for 'managed-node3'
42613 1727204620.06007: variable 'ansible_host' from source: host vars for 'managed-node3'
42613 1727204620.06015: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
42613 1727204620.06131: Set connection var ansible_shell_executable to /bin/sh
42613 1727204620.06144: Set connection var ansible_pipelining to False
42613 1727204620.06157: Set connection var ansible_module_compression to ZIP_DEFLATED
42613 1727204620.06163: Set connection var ansible_connection to ssh
42613 1727204620.06179: Set connection var ansible_timeout to 10
42613 1727204620.06190: Set connection var ansible_shell_type to sh
42613 1727204620.06220: variable 'ansible_shell_executable' from source: unknown
42613 1727204620.06227: variable 'ansible_connection' from source: unknown
42613 1727204620.06234: variable 'ansible_module_compression' from source: unknown
42613 1727204620.06244: variable 'ansible_shell_type' from source: unknown
42613 1727204620.06252: variable 'ansible_shell_executable' from source: unknown
42613 1727204620.06258: variable 'ansible_host' from source: host vars for 'managed-node3'
42613 1727204620.06268: variable 'ansible_pipelining' from source: unknown
42613 1727204620.06276: variable 'ansible_timeout' from source: unknown
42613 1727204620.06283: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
42613 1727204620.06427: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action)
42613 1727204620.06454: variable 'omit' from source: magic vars
42613 1727204620.06464: starting attempt loop
42613 1727204620.06475: running the handler
42613 1727204620.06488: _low_level_execute_command(): starting
42613 1727204620.06510: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
42613 1727204620.07263: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<<
42613 1727204620.07289: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
42613 1727204620.07304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
42613 1727204620.07321: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
42613 1727204620.07396: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
42613 1727204620.07444: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<<
42613 1727204620.07487: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
42613 1727204620.07693: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
42613 1727204620.07838: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
42613 1727204620.09668: stdout chunk (state=3): >>>/root <<<
42613 1727204620.09854: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
42613 1727204620.09873: stdout chunk (state=3): >>><<<
42613 1727204620.09891: stderr chunk (state=3): >>><<<
42613 1727204620.09924: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
42613 1727204620.09945: _low_level_execute_command(): starting
42613 1727204620.09956: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204620.0993059-45883-109698636753638 `" && echo ansible-tmp-1727204620.0993059-45883-109698636753638="` echo /root/.ansible/tmp/ansible-tmp-1727204620.0993059-45883-109698636753638 `" ) && sleep 0'
42613 1727204620.11190: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
42613 1727204620.11205: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<<
42613 1727204620.11226: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
42613 1727204620.11246: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
42613 1727204620.11419: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
42613 1727204620.13537: stdout chunk (state=3): >>>ansible-tmp-1727204620.0993059-45883-109698636753638=/root/.ansible/tmp/ansible-tmp-1727204620.0993059-45883-109698636753638 <<<
42613 1727204620.13677: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
42613 1727204620.13777: stderr chunk (state=3): >>><<<
42613 1727204620.13788: stdout chunk (state=3): >>><<<
42613 1727204620.13813: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204620.0993059-45883-109698636753638=/root/.ansible/tmp/ansible-tmp-1727204620.0993059-45883-109698636753638 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
42613 1727204620.13885: variable 'ansible_module_compression' from source: unknown
42613 1727204620.13934: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-42613igbktgy6/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED
42613 1727204620.14183: variable 'ansible_facts' from source: unknown
42613 1727204620.14187: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204620.0993059-45883-109698636753638/AnsiballZ_network_connections.py
42613 1727204620.14896: Sending initial data
42613 1727204620.14900: Sent initial data (168 bytes)
42613 1727204620.16182: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found <<<
42613 1727204620.16390: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
42613 1727204620.16419: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<<
42613 1727204620.16594: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
42613 1727204620.16703: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
42613 1727204620.18500: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<<
42613 1727204620.18598: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<<
42613 1727204620.18684: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-42613igbktgy6/tmpc7j6v5yn /root/.ansible/tmp/ansible-tmp-1727204620.0993059-45883-109698636753638/AnsiballZ_network_connections.py <<<
42613 1727204620.18690: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204620.0993059-45883-109698636753638/AnsiballZ_network_connections.py" <<<
42613 1727204620.18739: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-42613igbktgy6/tmpc7j6v5yn" to remote "/root/.ansible/tmp/ansible-tmp-1727204620.0993059-45883-109698636753638/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204620.0993059-45883-109698636753638/AnsiballZ_network_connections.py" <<<
42613 1727204620.21197: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
42613 1727204620.21355: stderr chunk (state=3): >>><<<
42613 1727204620.21358: stdout chunk (state=3): >>><<<
42613 1727204620.21360: done transferring module to remote
42613 1727204620.21363: _low_level_execute_command(): starting
42613 1727204620.21367: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204620.0993059-45883-109698636753638/ /root/.ansible/tmp/ansible-tmp-1727204620.0993059-45883-109698636753638/AnsiballZ_network_connections.py && sleep 0' 42613 1727204620.22694: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204620.22752: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204620.24806: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204620.24890: stderr chunk (state=3): >>><<< 42613 1727204620.24899: stdout chunk (state=3): >>><<< 42613 1727204620.24921: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204620.24929: _low_level_execute_command(): starting 42613 1727204620.24975: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204620.0993059-45883-109698636753638/AnsiballZ_network_connections.py && sleep 0' 42613 1727204620.25791: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found <<<
42613 1727204620.25807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
42613 1727204620.25904: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<<
42613 1727204620.26077: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
42613 1727204620.26213: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
42613 1727204620.58116: stdout chunk (state=3): >>>Traceback (most recent call last): <<<
42613 1727204620.58155: stdout chunk (state=3): >>>  File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_hmht3yrp/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back
  File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_hmht3yrp/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail <<<
42613 1727204620.58186: stdout chunk (state=3): >>>ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on ethtest0/4b766b42-9d6e-4768-952e-daf596460f29: error=unknown <<<
42613 1727204620.58378: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<<
42613 1727204620.60595: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. <<<
42613 1727204620.60668: stderr chunk (state=3): >>><<<
42613 1727204620.60690: stdout chunk (state=3): >>><<<
42613 1727204620.60843: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last):
  File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_hmht3yrp/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back
  File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_hmht3yrp/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail
ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on ethtest0/4b766b42-9d6e-4768-952e-daf596460f29: error=unknown
{"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}}
, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
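In the exchange above, the module's stdout carries a Python traceback followed by a JSON result, yet `_low_level_execute_command()` reports rc=0 and the task is later displayed as changed. This works because the result JSON is recovered from stdout even when leading non-JSON noise precedes it. A minimal sketch of that recovery idea (illustrative only, not ansible-core's actual implementation; the function name here is hypothetical):

```python
import json

def extract_module_result(stdout: str) -> dict:
    """Scan stdout for the first line that opens a JSON object and
    parse from there, skipping leading noise such as tracebacks."""
    lines = stdout.splitlines()
    for i, line in enumerate(lines):
        if line.lstrip().startswith("{"):
            return json.loads("\n".join(lines[i:]))
    raise ValueError("no JSON object found in module stdout")

# Stdout shaped like the log above: traceback first, result JSON last.
stdout = (
    "Traceback (most recent call last):\n"
    '  File "connection.py", line 113, in _nm_profile_volatile_update2_call_back\n'
    "LsrNetworkNmError: Connection volatilize aborted: error=unknown\n"
    '{"changed": true, "warnings": []}'
)
result = extract_module_result(stdout)
# The parsed result reports changed=true despite the preceding traceback.
```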
42613 1727204620.60847: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'ethtest0', 'persistent_state': 'absent'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204620.0993059-45883-109698636753638/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 42613 1727204620.60850: _low_level_execute_command(): starting 42613 1727204620.60852: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204620.0993059-45883-109698636753638/ > /dev/null 2>&1 && sleep 0' 42613 1727204620.61474: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204620.61682: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204620.61883: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204620.62101: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204620.64147: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204620.64208: stderr chunk (state=3): >>><<< 42613 1727204620.64212: stdout chunk (state=3): >>><<< 42613 1727204620.64227: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0
42613 1727204620.64234: handler run complete
42613 1727204620.64262: attempt loop complete, returning result
42613 1727204620.64267: _execute() done
42613 1727204620.64270: dumping result to json
42613 1727204620.64272: done dumping result, returning
42613 1727204620.64280: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [127b8e07-fff9-2f91-05d8-0000000000a1]
42613 1727204620.64288: sending task result for task 127b8e07-fff9-2f91-05d8-0000000000a1
42613 1727204620.64392: done sending task result for task 127b8e07-fff9-2f91-05d8-0000000000a1
42613 1727204620.64395: WORKER PROCESS EXITING
changed: [managed-node3] => {
    "_invocation": {
        "module_args": {
            "__debug_flags": "",
            "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
            "connections": [
                {
                    "name": "ethtest0",
                    "persistent_state": "absent"
                }
            ],
            "force_state_change": false,
            "ignore_errors": false,
            "provider": "nm"
        }
    },
    "changed": true
}

STDERR:

42613 1727204620.64494: no more pending results, returning what we have
42613 1727204620.64498: results queue empty
42613 1727204620.64499: checking for any_errors_fatal
42613 1727204620.64508: done checking for any_errors_fatal
42613 1727204620.64509: checking for max_fail_percentage
42613 1727204620.64511: done checking for max_fail_percentage
42613 1727204620.64512: checking to see if all hosts have failed and the running result is not ok
42613 1727204620.64512: done checking to see if all hosts have failed
42613 1727204620.64513: getting the remaining hosts for this loop
42613 1727204620.64515: done getting the remaining hosts for this loop
42613 1727204620.64519: getting the next task for host managed-node3
42613 1727204620.64524: done getting next task for host managed-node3
42613 1727204620.64528: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state
42613 1727204620.64530: ^ state is: HOST STATE: block=2, task=21,
rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 42613 1727204620.64539: getting variables 42613 1727204620.64542: in VariableManager get_vars() 42613 1727204620.64587: Calling all_inventory to load vars for managed-node3 42613 1727204620.64590: Calling groups_inventory to load vars for managed-node3 42613 1727204620.64592: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204620.64602: Calling all_plugins_play to load vars for managed-node3 42613 1727204620.64605: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204620.64608: Calling groups_plugins_play to load vars for managed-node3 42613 1727204620.68032: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204620.70644: done with get_vars() 42613 1727204620.70755: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 15:03:40 -0400 (0:00:00.920) 0:00:49.317 ***** 42613 1727204620.70953: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_state 42613 1727204620.71488: worker is 1 (out of 1 available) 42613 1727204620.71502: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_state 42613 1727204620.71514: done queuing things up, now waiting for results queue to drain 42613 1727204620.71516: waiting for pending results... 
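For reference, the module_args recorded for the "Configure networking connection profiles" task above correspond to a role invocation along these lines (a hedged sketch: the play structure and host pattern are illustrative assumptions; only the connection spec is taken from the log):

```yaml
# Hypothetical playbook reproducing the invocation seen in the log.
# The role name is real (fedora.linux_system_roles.network); the play
# targeting below is an assumption for illustration.
- hosts: managed-node3
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_connections:
          # Matches the recorded module_args: remove the persistent
          # profile named ethtest0 (provider 'nm' is the role default
          # shown in the log).
          - name: ethtest0
            persistent_state: absent
```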
42613 1727204620.72050: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state
42613 1727204620.72056: in run() - task 127b8e07-fff9-2f91-05d8-0000000000a2
42613 1727204620.72059: variable 'ansible_search_path' from source: unknown
42613 1727204620.72062: variable 'ansible_search_path' from source: unknown
42613 1727204620.72107: calling self._execute()
42613 1727204620.72384: variable 'ansible_host' from source: host vars for 'managed-node3'
42613 1727204620.72389: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
42613 1727204620.72430: variable 'omit' from source: magic vars
42613 1727204620.73269: variable 'ansible_distribution_major_version' from source: facts
42613 1727204620.73273: Evaluated conditional (ansible_distribution_major_version != '6'): True
42613 1727204620.73594: variable 'network_state' from source: role '' defaults
42613 1727204620.73618: Evaluated conditional (network_state != {}): False
42613 1727204620.73629: when evaluation is False, skipping this task
42613 1727204620.73637: _execute() done
42613 1727204620.73645: dumping result to json
42613 1727204620.73653: done dumping result, returning
42613 1727204620.73667: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state [127b8e07-fff9-2f91-05d8-0000000000a2]
42613 1727204620.73680: sending task result for task 127b8e07-fff9-2f91-05d8-0000000000a2
skipping: [managed-node3] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
42613 1727204620.73989: no more pending results, returning what we have
42613 1727204620.73997: results queue empty
42613 1727204620.73999: checking for any_errors_fatal
42613 1727204620.74012: done checking for any_errors_fatal
42613 1727204620.74014: checking for max_fail_percentage
42613 1727204620.74017: done checking for max_fail_percentage
42613 1727204620.74018:
checking to see if all hosts have failed and the running result is not ok 42613 1727204620.74019: done checking to see if all hosts have failed 42613 1727204620.74023: getting the remaining hosts for this loop 42613 1727204620.74025: done getting the remaining hosts for this loop 42613 1727204620.74030: getting the next task for host managed-node3 42613 1727204620.74039: done getting next task for host managed-node3 42613 1727204620.74043: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 42613 1727204620.74047: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 42613 1727204620.74069: getting variables 42613 1727204620.74073: in VariableManager get_vars() 42613 1727204620.74122: Calling all_inventory to load vars for managed-node3 42613 1727204620.74125: Calling groups_inventory to load vars for managed-node3 42613 1727204620.74128: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204620.74144: Calling all_plugins_play to load vars for managed-node3 42613 1727204620.74147: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204620.74151: Calling groups_plugins_play to load vars for managed-node3 42613 1727204620.74931: done sending task result for task 127b8e07-fff9-2f91-05d8-0000000000a2 42613 1727204620.74935: WORKER PROCESS EXITING 42613 1727204620.77530: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204620.80803: done with get_vars() 42613 1727204620.80861: done getting variables 42613 1727204620.81025: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 15:03:40 -0400 (0:00:00.101) 0:00:49.418 ***** 42613 1727204620.81064: entering _queue_task() for managed-node3/debug 42613 1727204620.81837: worker is 1 (out of 1 available) 42613 1727204620.81852: exiting _queue_task() for managed-node3/debug 42613 1727204620.81868: done queuing things up, now waiting for results queue to drain 42613 1727204620.81869: waiting for pending results... 42613 1727204620.82195: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 42613 1727204620.82328: in run() - task 127b8e07-fff9-2f91-05d8-0000000000a3 42613 1727204620.82354: variable 'ansible_search_path' from source: unknown 42613 1727204620.82362: variable 'ansible_search_path' from source: unknown 42613 1727204620.82408: calling self._execute() 42613 1727204620.82635: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204620.82650: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204620.82664: variable 'omit' from source: magic vars 42613 1727204620.83191: variable 'ansible_distribution_major_version' from source: facts 42613 1727204620.83282: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204620.83291: variable 'omit' from source: magic vars 42613 1727204620.83486: variable 'omit' from source: magic vars 42613 1727204620.83490: variable 'omit' from source: magic vars 42613 1727204620.83493: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 42613 1727204620.83527: Loading 
Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 42613 1727204620.83559: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 42613 1727204620.83644: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204620.83681: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204620.83722: variable 'inventory_hostname' from source: host vars for 'managed-node3' 42613 1727204620.83732: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204620.83743: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204620.83872: Set connection var ansible_shell_executable to /bin/sh 42613 1727204620.83901: Set connection var ansible_pipelining to False 42613 1727204620.83918: Set connection var ansible_module_compression to ZIP_DEFLATED 42613 1727204620.83926: Set connection var ansible_connection to ssh 42613 1727204620.83935: Set connection var ansible_timeout to 10 42613 1727204620.83941: Set connection var ansible_shell_type to sh 42613 1727204620.83974: variable 'ansible_shell_executable' from source: unknown 42613 1727204620.83984: variable 'ansible_connection' from source: unknown 42613 1727204620.83991: variable 'ansible_module_compression' from source: unknown 42613 1727204620.83997: variable 'ansible_shell_type' from source: unknown 42613 1727204620.84003: variable 'ansible_shell_executable' from source: unknown 42613 1727204620.84009: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204620.84017: variable 'ansible_pipelining' from source: unknown 42613 1727204620.84029: variable 'ansible_timeout' from source: unknown 42613 1727204620.84038: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node3' 42613 1727204620.84479: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 42613 1727204620.84483: variable 'omit' from source: magic vars 42613 1727204620.84485: starting attempt loop 42613 1727204620.84488: running the handler 42613 1727204620.84760: variable '__network_connections_result' from source: set_fact 42613 1727204620.84829: handler run complete 42613 1727204620.84887: attempt loop complete, returning result 42613 1727204620.84895: _execute() done 42613 1727204620.84902: dumping result to json 42613 1727204620.84909: done dumping result, returning 42613 1727204620.84922: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [127b8e07-fff9-2f91-05d8-0000000000a3] 42613 1727204620.84948: sending task result for task 127b8e07-fff9-2f91-05d8-0000000000a3 42613 1727204620.85306: done sending task result for task 127b8e07-fff9-2f91-05d8-0000000000a3 42613 1727204620.85312: WORKER PROCESS EXITING ok: [managed-node3] => { "__network_connections_result.stderr_lines": [ "" ] } 42613 1727204620.85406: no more pending results, returning what we have 42613 1727204620.85412: results queue empty 42613 1727204620.85413: checking for any_errors_fatal 42613 1727204620.85425: done checking for any_errors_fatal 42613 1727204620.85427: checking for max_fail_percentage 42613 1727204620.85429: done checking for max_fail_percentage 42613 1727204620.85430: checking to see if all hosts have failed and the running result is not ok 42613 1727204620.85431: done checking to see if all hosts have failed 42613 1727204620.85432: getting the remaining hosts for this loop 42613 1727204620.85434: done getting the remaining hosts for this loop 
42613 1727204620.85438: getting the next task for host managed-node3 42613 1727204620.85445: done getting next task for host managed-node3 42613 1727204620.85449: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 42613 1727204620.85451: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 42613 1727204620.85463: getting variables 42613 1727204620.85464: in VariableManager get_vars() 42613 1727204620.85510: Calling all_inventory to load vars for managed-node3 42613 1727204620.85513: Calling groups_inventory to load vars for managed-node3 42613 1727204620.85515: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204620.85527: Calling all_plugins_play to load vars for managed-node3 42613 1727204620.85531: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204620.85534: Calling groups_plugins_play to load vars for managed-node3 42613 1727204620.98978: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204621.01441: done with get_vars() 42613 1727204621.01486: done getting variables 42613 1727204621.01555: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 15:03:41 -0400 (0:00:00.205) 0:00:49.624 
***** 42613 1727204621.01610: entering _queue_task() for managed-node3/debug 42613 1727204621.02084: worker is 1 (out of 1 available) 42613 1727204621.02099: exiting _queue_task() for managed-node3/debug 42613 1727204621.02114: done queuing things up, now waiting for results queue to drain 42613 1727204621.02117: waiting for pending results... 42613 1727204621.02363: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 42613 1727204621.02460: in run() - task 127b8e07-fff9-2f91-05d8-0000000000a4 42613 1727204621.02475: variable 'ansible_search_path' from source: unknown 42613 1727204621.02481: variable 'ansible_search_path' from source: unknown 42613 1727204621.02511: calling self._execute() 42613 1727204621.02597: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204621.02605: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204621.02613: variable 'omit' from source: magic vars 42613 1727204621.03048: variable 'ansible_distribution_major_version' from source: facts 42613 1727204621.03060: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204621.03070: variable 'omit' from source: magic vars 42613 1727204621.03131: variable 'omit' from source: magic vars 42613 1727204621.03171: variable 'omit' from source: magic vars 42613 1727204621.03217: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 42613 1727204621.03249: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 42613 1727204621.03272: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 42613 1727204621.03286: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204621.03313: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204621.03336: variable 'inventory_hostname' from source: host vars for 'managed-node3' 42613 1727204621.03340: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204621.03345: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204621.03440: Set connection var ansible_shell_executable to /bin/sh 42613 1727204621.03448: Set connection var ansible_pipelining to False 42613 1727204621.03456: Set connection var ansible_module_compression to ZIP_DEFLATED 42613 1727204621.03459: Set connection var ansible_connection to ssh 42613 1727204621.03467: Set connection var ansible_timeout to 10 42613 1727204621.03469: Set connection var ansible_shell_type to sh 42613 1727204621.03493: variable 'ansible_shell_executable' from source: unknown 42613 1727204621.03496: variable 'ansible_connection' from source: unknown 42613 1727204621.03499: variable 'ansible_module_compression' from source: unknown 42613 1727204621.03501: variable 'ansible_shell_type' from source: unknown 42613 1727204621.03504: variable 'ansible_shell_executable' from source: unknown 42613 1727204621.03507: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204621.03509: variable 'ansible_pipelining' from source: unknown 42613 1727204621.03512: variable 'ansible_timeout' from source: unknown 42613 1727204621.03516: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204621.03697: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 42613 1727204621.03849: variable 'omit' from source: magic vars 42613 1727204621.03853: starting attempt 
loop
42613 1727204621.03856: running the handler
42613 1727204621.03858: variable '__network_connections_result' from source: set_fact
42613 1727204621.04181: variable '__network_connections_result' from source: set_fact
42613 1727204621.04185: handler run complete
42613 1727204621.04188: attempt loop complete, returning result
42613 1727204621.04190: _execute() done
42613 1727204621.04193: dumping result to json
42613 1727204621.04195: done dumping result, returning
42613 1727204621.04244: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [127b8e07-fff9-2f91-05d8-0000000000a4]
42613 1727204621.04257: sending task result for task 127b8e07-fff9-2f91-05d8-0000000000a4
42613 1727204621.04804: done sending task result for task 127b8e07-fff9-2f91-05d8-0000000000a4
42613 1727204621.04809: WORKER PROCESS EXITING
ok: [managed-node3] => {
    "__network_connections_result": {
        "_invocation": {
            "module_args": {
                "__debug_flags": "",
                "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
                "connections": [
                    {
                        "name": "ethtest0",
                        "persistent_state": "absent"
                    }
                ],
                "force_state_change": false,
                "ignore_errors": false,
                "provider": "nm"
            }
        },
        "changed": true,
        "failed": false,
        "stderr": "\n",
        "stderr_lines": [
            ""
        ]
    }
}
42613 1727204621.04913: no more pending results, returning what we have
42613 1727204621.04917: results queue empty
42613 1727204621.04921: checking for any_errors_fatal
42613 1727204621.04934: done checking for any_errors_fatal
42613 1727204621.04935: checking for max_fail_percentage
42613 1727204621.04937: done checking for max_fail_percentage
42613 1727204621.04938: checking to see if all hosts have failed and the running result is not ok
42613 1727204621.04939: done checking to see if all hosts have failed
42613 1727204621.04940: getting the remaining hosts for this loop
42613 1727204621.04943: done getting the remaining hosts for this loop
42613 1727204621.04947: getting the next
task for host managed-node3
42613 1727204621.04955: done getting next task for host managed-node3
42613 1727204621.04960: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state
42613 1727204621.04962: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
42613 1727204621.04977: getting variables
42613 1727204621.04979: in VariableManager get_vars()
42613 1727204621.05025: Calling all_inventory to load vars for managed-node3
42613 1727204621.05028: Calling groups_inventory to load vars for managed-node3
42613 1727204621.05031: Calling all_plugins_inventory to load vars for managed-node3
42613 1727204621.05044: Calling all_plugins_play to load vars for managed-node3
42613 1727204621.05048: Calling groups_plugins_inventory to load vars for managed-node3
42613 1727204621.05052: Calling groups_plugins_play to load vars for managed-node3
42613 1727204621.08870: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
42613 1727204621.11787: done with get_vars()
42613 1727204621.11830: done getting variables
42613 1727204621.11899: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] ***
task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186
Tuesday 24 September 2024 15:03:41 -0400 (0:00:00.103) 0:00:49.727 *****
42613 1727204621.11933: entering _queue_task() for managed-node3/debug
42613 1727204621.12224: worker is 1 (out of 1 available)
42613 1727204621.12240: exiting _queue_task() for managed-node3/debug
42613 1727204621.12252: done queuing things up, now waiting for results queue to drain
42613 1727204621.12253: waiting for pending results...
42613 1727204621.12465: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state
42613 1727204621.12554: in run() - task 127b8e07-fff9-2f91-05d8-0000000000a5
42613 1727204621.12574: variable 'ansible_search_path' from source: unknown
42613 1727204621.12578: variable 'ansible_search_path' from source: unknown
42613 1727204621.12613: calling self._execute()
42613 1727204621.12702: variable 'ansible_host' from source: host vars for 'managed-node3'
42613 1727204621.12708: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
42613 1727204621.12717: variable 'omit' from source: magic vars
42613 1727204621.13043: variable 'ansible_distribution_major_version' from source: facts
42613 1727204621.13057: Evaluated conditional (ansible_distribution_major_version != '6'): True
42613 1727204621.13152: variable 'network_state' from source: role '' defaults
42613 1727204621.13161: Evaluated conditional (network_state != {}): False
42613 1727204621.13165: when evaluation is False, skipping this task
42613 1727204621.13171: _execute() done
42613 1727204621.13174: dumping result to json
42613 1727204621.13176: done dumping result, returning
42613 1727204621.13183: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [127b8e07-fff9-2f91-05d8-0000000000a5]
42613 1727204621.13189: sending task result for task 127b8e07-fff9-2f91-05d8-0000000000a5
42613 1727204621.13291: done sending task result for task 127b8e07-fff9-2f91-05d8-0000000000a5
42613 1727204621.13294: WORKER PROCESS EXITING
skipping: [managed-node3] => {
    "false_condition": "network_state != {}"
}
42613 1727204621.13341: no more pending results, returning what we have
42613 1727204621.13345: results queue empty
42613 1727204621.13346: checking for any_errors_fatal
42613 1727204621.13355: done checking for any_errors_fatal
42613 1727204621.13356: checking for max_fail_percentage
42613 1727204621.13359: done checking for max_fail_percentage
42613 1727204621.13360: checking to see if all hosts have failed and the running result is not ok
42613 1727204621.13361: done checking to see if all hosts have failed
42613 1727204621.13361: getting the remaining hosts for this loop
42613 1727204621.13363: done getting the remaining hosts for this loop
42613 1727204621.13369: getting the next task for host managed-node3
42613 1727204621.13376: done getting next task for host managed-node3
42613 1727204621.13380: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity
42613 1727204621.13382: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
42613 1727204621.13398: getting variables
42613 1727204621.13399: in VariableManager get_vars()
42613 1727204621.13439: Calling all_inventory to load vars for managed-node3
42613 1727204621.13442: Calling groups_inventory to load vars for managed-node3
42613 1727204621.13444: Calling all_plugins_inventory to load vars for managed-node3
42613 1727204621.13454: Calling all_plugins_play to load vars for managed-node3
42613 1727204621.13457: Calling groups_plugins_inventory to load vars for managed-node3
42613 1727204621.13460: Calling groups_plugins_play to load vars for managed-node3
42613 1727204621.14507: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
42613 1727204621.15849: done with get_vars()
42613 1727204621.15875: done getting variables

TASK [fedora.linux_system_roles.network : Re-test connectivity] ****************
task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Tuesday 24 September 2024 15:03:41 -0400 (0:00:00.040) 0:00:49.767 *****
42613 1727204621.15950: entering _queue_task() for managed-node3/ping
42613 1727204621.16233: worker is 1 (out of 1 available)
42613 1727204621.16246: exiting _queue_task() for managed-node3/ping
42613 1727204621.16259: done queuing things up, now waiting for results queue to drain
42613 1727204621.16261: waiting for pending results...
42613 1727204621.16469: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity
42613 1727204621.16549: in run() - task 127b8e07-fff9-2f91-05d8-0000000000a6
42613 1727204621.16562: variable 'ansible_search_path' from source: unknown
42613 1727204621.16571: variable 'ansible_search_path' from source: unknown
42613 1727204621.16605: calling self._execute()
42613 1727204621.16691: variable 'ansible_host' from source: host vars for 'managed-node3'
42613 1727204621.16697: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
42613 1727204621.16708: variable 'omit' from source: magic vars
42613 1727204621.17136: variable 'ansible_distribution_major_version' from source: facts
42613 1727204621.17143: Evaluated conditional (ansible_distribution_major_version != '6'): True
42613 1727204621.17146: variable 'omit' from source: magic vars
42613 1727204621.17184: variable 'omit' from source: magic vars
42613 1727204621.17214: variable 'omit' from source: magic vars
42613 1727204621.17255: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
42613 1727204621.17289: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
42613 1727204621.17309: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
42613 1727204621.17324: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
42613 1727204621.17334: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
42613 1727204621.17362: variable 'inventory_hostname' from source: host vars for 'managed-node3'
42613 1727204621.17367: variable 'ansible_host' from source: host vars for 'managed-node3'
42613 1727204621.17370: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
42613 1727204621.17482: Set connection var ansible_shell_executable to /bin/sh
42613 1727204621.17486: Set connection var ansible_pipelining to False
42613 1727204621.17494: Set connection var ansible_module_compression to ZIP_DEFLATED
42613 1727204621.17496: Set connection var ansible_connection to ssh
42613 1727204621.17502: Set connection var ansible_timeout to 10
42613 1727204621.17504: Set connection var ansible_shell_type to sh
42613 1727204621.17526: variable 'ansible_shell_executable' from source: unknown
42613 1727204621.17531: variable 'ansible_connection' from source: unknown
42613 1727204621.17534: variable 'ansible_module_compression' from source: unknown
42613 1727204621.17537: variable 'ansible_shell_type' from source: unknown
42613 1727204621.17542: variable 'ansible_shell_executable' from source: unknown
42613 1727204621.17545: variable 'ansible_host' from source: host vars for 'managed-node3'
42613 1727204621.17547: variable 'ansible_pipelining' from source: unknown
42613 1727204621.17550: variable 'ansible_timeout' from source: unknown
42613 1727204621.17552: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
42613 1727204621.17724: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action)
42613 1727204621.17735: variable 'omit' from source: magic vars
42613 1727204621.17739: starting attempt loop
42613 1727204621.17744: running the handler
42613 1727204621.17756: _low_level_execute_command(): starting
42613 1727204621.17762: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
42613 1727204621.18337: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
42613 1727204621.18344: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
42613 1727204621.18347: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
42613 1727204621.18350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
42613 1727204621.18407: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<<
42613 1727204621.18411: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
42613 1727204621.18414: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
42613 1727204621.18498: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
42613 1727204621.20350: stdout chunk (state=3): >>>/root <<<
42613 1727204621.20488: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
42613 1727204621.20520: stderr chunk (state=3): >>><<<
42613 1727204621.20524: stdout chunk (state=3): >>><<<
42613 1727204621.20546: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
42613 1727204621.20560: _low_level_execute_command(): starting
42613 1727204621.20568: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204621.205464-46043-212643550776432 `" && echo ansible-tmp-1727204621.205464-46043-212643550776432="` echo /root/.ansible/tmp/ansible-tmp-1727204621.205464-46043-212643550776432 `" ) && sleep 0'
42613 1727204621.21052: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
42613 1727204621.21086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
42613 1727204621.21098: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<<
42613 1727204621.21101: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
42613 1727204621.21150: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<<
42613 1727204621.21154: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
42613 1727204621.21156: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
42613 1727204621.21237: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
42613 1727204621.23413: stdout chunk (state=3): >>>ansible-tmp-1727204621.205464-46043-212643550776432=/root/.ansible/tmp/ansible-tmp-1727204621.205464-46043-212643550776432 <<<
42613 1727204621.23526: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
42613 1727204621.23599: stderr chunk (state=3): >>><<<
42613 1727204621.23603: stdout chunk (state=3): >>><<<
42613 1727204621.23619: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204621.205464-46043-212643550776432=/root/.ansible/tmp/ansible-tmp-1727204621.205464-46043-212643550776432 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
42613 1727204621.23677: variable 'ansible_module_compression' from source: unknown
42613 1727204621.23715: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-42613igbktgy6/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED
42613 1727204621.23754: variable 'ansible_facts' from source: unknown
42613 1727204621.23819: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204621.205464-46043-212643550776432/AnsiballZ_ping.py
42613 1727204621.23939: Sending initial data
42613 1727204621.23945: Sent initial data (152 bytes)
42613 1727204621.24465: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
42613 1727204621.24470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
42613 1727204621.24473: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
42613 1727204621.24475: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
42613 1727204621.24534: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<<
42613 1727204621.24537: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
42613 1727204621.24622: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
42613 1727204621.26408: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<<
42613 1727204621.26496: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<<
42613 1727204621.26564: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-42613igbktgy6/tmp2ocx9j0q /root/.ansible/tmp/ansible-tmp-1727204621.205464-46043-212643550776432/AnsiballZ_ping.py <<<
42613 1727204621.26577: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204621.205464-46043-212643550776432/AnsiballZ_ping.py" <<<
42613 1727204621.26689: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-42613igbktgy6/tmp2ocx9j0q" to remote "/root/.ansible/tmp/ansible-tmp-1727204621.205464-46043-212643550776432/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204621.205464-46043-212643550776432/AnsiballZ_ping.py" <<<
42613 1727204621.27674: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
42613 1727204621.27679: stderr chunk (state=3): >>><<<
42613 1727204621.27681: stdout chunk (state=3): >>><<<
42613 1727204621.27684: done transferring module to remote
42613 1727204621.27686: _low_level_execute_command(): starting
42613 1727204621.27689: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204621.205464-46043-212643550776432/ /root/.ansible/tmp/ansible-tmp-1727204621.205464-46043-212643550776432/AnsiballZ_ping.py && sleep 0'
42613 1727204621.28303: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<<
42613 1727204621.28385: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
42613 1727204621.28394: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
42613 1727204621.28460: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<<
42613 1727204621.28464: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
42613 1727204621.28478: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
42613 1727204621.28559: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
42613 1727204621.30776: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
42613 1727204621.30781: stdout chunk (state=3): >>><<<
42613 1727204621.30784: stderr chunk (state=3): >>><<<
42613 1727204621.30787: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
42613 1727204621.30789: _low_level_execute_command(): starting
42613 1727204621.30791: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204621.205464-46043-212643550776432/AnsiballZ_ping.py && sleep 0'
42613 1727204621.31322: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
42613 1727204621.31340: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
42613 1727204621.31383: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<<
42613 1727204621.31396: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
42613 1727204621.31485: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
42613 1727204621.48824: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<<
42613 1727204621.50498: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. <<<
42613 1727204621.50502: stdout chunk (state=3): >>><<<
42613 1727204621.50570: stderr chunk (state=3): >>><<<
42613 1727204621.50577: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed.
42613 1727204621.50580: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204621.205464-46043-212643550776432/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None})
42613 1727204621.50583: _low_level_execute_command(): starting
42613 1727204621.50585: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204621.205464-46043-212643550776432/ > /dev/null 2>&1 && sleep 0'
42613 1727204621.51208: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<<
42613 1727204621.51216: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
42613 1727204621.51229: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
42613 1727204621.51247: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
42613 1727204621.51259: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<<
42613 1727204621.51269: stderr chunk (state=3): >>>debug2: match not found <<<
42613 1727204621.51279: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
42613 1727204621.51292: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
42613 1727204621.51300: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.169 is address <<<
42613 1727204621.51307: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
42613 1727204621.51315: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
42613 1727204621.51326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
42613 1727204621.51338: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
42613 1727204621.51449: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<<
42613 1727204621.51454: stderr chunk (state=3): >>>debug2: match found <<<
42613 1727204621.51456: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
42613 1727204621.51459: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<<
42613 1727204621.51469: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
42613 1727204621.51489: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
42613 1727204621.51576: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
42613 1727204621.53791: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
42613 1727204621.53828: stderr chunk (state=3): >>><<<
42613 1727204621.53831: stdout chunk (state=3): >>><<<
42613 1727204621.54175: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
42613 1727204621.54179: handler run complete
42613 1727204621.54182: attempt loop complete, returning result
42613 1727204621.54184: _execute() done
42613 1727204621.54187: dumping result to json
42613 1727204621.54189: done dumping result, returning
42613 1727204621.54191: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [127b8e07-fff9-2f91-05d8-0000000000a6]
42613 1727204621.54194: sending task result for task 127b8e07-fff9-2f91-05d8-0000000000a6
42613 1727204621.54275: done sending task result for task 127b8e07-fff9-2f91-05d8-0000000000a6
42613 1727204621.54278: WORKER PROCESS EXITING
ok: [managed-node3] => {
    "changed": false,
    "ping": "pong"
}
42613 1727204621.54339: no more pending results, returning what we have
42613 1727204621.54345: results queue empty
42613 1727204621.54345: checking for any_errors_fatal
42613 1727204621.54356: done checking for any_errors_fatal
42613 1727204621.54357: checking for max_fail_percentage
42613 1727204621.54359: done checking for max_fail_percentage
42613 1727204621.54360: checking to see if all hosts have failed and the running result is not ok
42613 1727204621.54361: done checking to see if all hosts have failed
42613 1727204621.54361: getting the remaining hosts for this loop
42613 1727204621.54363: done getting the remaining hosts for this loop
42613 1727204621.54370: getting the next task for host managed-node3
42613 1727204621.54380: done getting next task for host managed-node3
42613 1727204621.54382: ^ task is: TASK: meta (role_complete)
42613 1727204621.54384: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
42613 1727204621.54396: getting variables
42613 1727204621.54397: in VariableManager get_vars()
42613 1727204621.54444: Calling all_inventory to load vars for managed-node3
42613 1727204621.54447: Calling groups_inventory to load vars for managed-node3
42613 1727204621.54449: Calling all_plugins_inventory to load vars for managed-node3
42613 1727204621.54462: Calling all_plugins_play to load vars for managed-node3
42613 1727204621.54788: Calling groups_plugins_inventory to load vars for managed-node3
42613 1727204621.54797: Calling groups_plugins_play to load vars for managed-node3
42613 1727204621.59247: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
42613 1727204621.64796: done with get_vars()
42613 1727204621.64848: done getting variables
42613 1727204621.65191: done queuing things up, now waiting for results queue to drain
42613 1727204621.65194: results queue empty
42613 1727204621.65194: checking for any_errors_fatal
42613 1727204621.65198: done checking for any_errors_fatal
42613 1727204621.65199: checking for max_fail_percentage
42613 1727204621.65200: done checking for max_fail_percentage
42613 1727204621.65201: checking to see if all hosts have failed and the running result is not ok
42613 1727204621.65202: done checking to see if all hosts have failed
42613 1727204621.65203: getting the remaining hosts for this loop
42613 1727204621.65204: done getting the remaining hosts for this loop
42613 1727204621.65207: getting the next task for host managed-node3
42613 1727204621.65211: done getting next task for host managed-node3
42613 1727204621.65213: ^ task is: TASK: meta (flush_handlers)
42613 1727204621.65215: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
42613 1727204621.65218: getting variables
42613 1727204621.65219: in VariableManager get_vars()
42613 1727204621.65236: Calling all_inventory to load vars for managed-node3
42613 1727204621.65238: Calling groups_inventory to load vars for managed-node3
42613 1727204621.65243: Calling all_plugins_inventory to load vars for managed-node3
42613 1727204621.65249: Calling all_plugins_play to load vars for managed-node3
42613 1727204621.65251: Calling groups_plugins_inventory to load vars for managed-node3
42613 1727204621.65254: Calling groups_plugins_play to load vars for managed-node3
42613 1727204621.69507: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
42613 1727204621.73991: done with get_vars()
42613 1727204621.74029: done getting variables
42613 1727204621.74098: in VariableManager get_vars()
42613 1727204621.74114: Calling all_inventory to load vars for managed-node3
42613 1727204621.74117: Calling groups_inventory to load vars for managed-node3
42613 1727204621.74119: Calling all_plugins_inventory to load vars for managed-node3
42613 1727204621.74125: Calling all_plugins_play to load vars for managed-node3
42613 1727204621.74127: Calling groups_plugins_inventory to load vars for managed-node3
42613 1727204621.74130: Calling groups_plugins_play to load vars for managed-node3
42613 1727204621.75797:
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204621.79686: done with get_vars() 42613 1727204621.79724: done queuing things up, now waiting for results queue to drain 42613 1727204621.79726: results queue empty 42613 1727204621.79727: checking for any_errors_fatal 42613 1727204621.79729: done checking for any_errors_fatal 42613 1727204621.79730: checking for max_fail_percentage 42613 1727204621.79731: done checking for max_fail_percentage 42613 1727204621.79732: checking to see if all hosts have failed and the running result is not ok 42613 1727204621.79732: done checking to see if all hosts have failed 42613 1727204621.79733: getting the remaining hosts for this loop 42613 1727204621.79734: done getting the remaining hosts for this loop 42613 1727204621.79737: getting the next task for host managed-node3 42613 1727204621.79743: done getting next task for host managed-node3 42613 1727204621.79745: ^ task is: TASK: meta (flush_handlers) 42613 1727204621.79747: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204621.79755: getting variables 42613 1727204621.79757: in VariableManager get_vars() 42613 1727204621.79773: Calling all_inventory to load vars for managed-node3 42613 1727204621.79776: Calling groups_inventory to load vars for managed-node3 42613 1727204621.79778: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204621.79784: Calling all_plugins_play to load vars for managed-node3 42613 1727204621.79787: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204621.79790: Calling groups_plugins_play to load vars for managed-node3 42613 1727204621.81767: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204621.84220: done with get_vars() 42613 1727204621.84267: done getting variables 42613 1727204621.84337: in VariableManager get_vars() 42613 1727204621.84356: Calling all_inventory to load vars for managed-node3 42613 1727204621.84359: Calling groups_inventory to load vars for managed-node3 42613 1727204621.84361: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204621.84369: Calling all_plugins_play to load vars for managed-node3 42613 1727204621.84372: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204621.84375: Calling groups_plugins_play to load vars for managed-node3 42613 1727204621.87252: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204621.90359: done with get_vars() 42613 1727204621.90421: done queuing things up, now waiting for results queue to drain 42613 1727204621.90425: results queue empty 42613 1727204621.90426: checking for any_errors_fatal 42613 1727204621.90428: done checking for any_errors_fatal 42613 1727204621.90429: checking for max_fail_percentage 42613 1727204621.90430: done checking for max_fail_percentage 42613 1727204621.90431: checking to see if all hosts have failed and the running result is not 
ok 42613 1727204621.90432: done checking to see if all hosts have failed 42613 1727204621.90433: getting the remaining hosts for this loop 42613 1727204621.90434: done getting the remaining hosts for this loop 42613 1727204621.90445: getting the next task for host managed-node3 42613 1727204621.90450: done getting next task for host managed-node3 42613 1727204621.90451: ^ task is: None 42613 1727204621.90452: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 42613 1727204621.90453: done queuing things up, now waiting for results queue to drain 42613 1727204621.90454: results queue empty 42613 1727204621.90455: checking for any_errors_fatal 42613 1727204621.90456: done checking for any_errors_fatal 42613 1727204621.90456: checking for max_fail_percentage 42613 1727204621.90457: done checking for max_fail_percentage 42613 1727204621.90458: checking to see if all hosts have failed and the running result is not ok 42613 1727204621.90459: done checking to see if all hosts have failed 42613 1727204621.90460: getting the next task for host managed-node3 42613 1727204621.90462: done getting next task for host managed-node3 42613 1727204621.90463: ^ task is: None 42613 1727204621.90464: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204621.90507: in VariableManager get_vars() 42613 1727204621.90524: done with get_vars() 42613 1727204621.90530: in VariableManager get_vars() 42613 1727204621.90542: done with get_vars() 42613 1727204621.90547: variable 'omit' from source: magic vars 42613 1727204621.90605: in VariableManager get_vars() 42613 1727204621.90619: done with get_vars() 42613 1727204621.90646: variable 'omit' from source: magic vars PLAY [Assert device and profile are absent] ************************************ 42613 1727204621.90946: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 42613 1727204621.90975: getting the remaining hosts for this loop 42613 1727204621.90977: done getting the remaining hosts for this loop 42613 1727204621.90979: getting the next task for host managed-node3 42613 1727204621.90983: done getting next task for host managed-node3 42613 1727204621.90990: ^ task is: TASK: Gathering Facts 42613 1727204621.90992: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204621.90994: getting variables 42613 1727204621.90995: in VariableManager get_vars() 42613 1727204621.91006: Calling all_inventory to load vars for managed-node3 42613 1727204621.91008: Calling groups_inventory to load vars for managed-node3 42613 1727204621.91011: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204621.91017: Calling all_plugins_play to load vars for managed-node3 42613 1727204621.91020: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204621.91023: Calling groups_plugins_play to load vars for managed-node3 42613 1727204621.92959: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204621.95606: done with get_vars() 42613 1727204621.95747: done getting variables 42613 1727204621.95794: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:227 Tuesday 24 September 2024 15:03:41 -0400 (0:00:00.798) 0:00:50.566 ***** 42613 1727204621.95820: entering _queue_task() for managed-node3/gather_facts 42613 1727204621.96264: worker is 1 (out of 1 available) 42613 1727204621.96278: exiting _queue_task() for managed-node3/gather_facts 42613 1727204621.96294: done queuing things up, now waiting for results queue to drain 42613 1727204621.96296: waiting for pending results... 
42613 1727204621.96675: running TaskExecutor() for managed-node3/TASK: Gathering Facts 42613 1727204621.96757: in run() - task 127b8e07-fff9-2f91-05d8-00000000066a 42613 1727204621.96782: variable 'ansible_search_path' from source: unknown 42613 1727204621.97022: calling self._execute() 42613 1727204621.97304: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204621.97316: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204621.97332: variable 'omit' from source: magic vars 42613 1727204621.97823: variable 'ansible_distribution_major_version' from source: facts 42613 1727204621.97919: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204621.97928: variable 'omit' from source: magic vars 42613 1727204621.97930: variable 'omit' from source: magic vars 42613 1727204621.97951: variable 'omit' from source: magic vars 42613 1727204621.98002: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 42613 1727204621.98060: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 42613 1727204621.98093: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 42613 1727204621.98117: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204621.98137: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204621.98183: variable 'inventory_hostname' from source: host vars for 'managed-node3' 42613 1727204621.98192: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204621.98201: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204621.98332: Set connection var ansible_shell_executable to /bin/sh 42613 1727204621.98353: Set 
connection var ansible_pipelining to False 42613 1727204621.98462: Set connection var ansible_module_compression to ZIP_DEFLATED 42613 1727204621.98466: Set connection var ansible_connection to ssh 42613 1727204621.98471: Set connection var ansible_timeout to 10 42613 1727204621.98474: Set connection var ansible_shell_type to sh 42613 1727204621.98477: variable 'ansible_shell_executable' from source: unknown 42613 1727204621.98478: variable 'ansible_connection' from source: unknown 42613 1727204621.98481: variable 'ansible_module_compression' from source: unknown 42613 1727204621.98483: variable 'ansible_shell_type' from source: unknown 42613 1727204621.98485: variable 'ansible_shell_executable' from source: unknown 42613 1727204621.98487: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204621.98489: variable 'ansible_pipelining' from source: unknown 42613 1727204621.98491: variable 'ansible_timeout' from source: unknown 42613 1727204621.98494: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204621.98722: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 42613 1727204621.98725: variable 'omit' from source: magic vars 42613 1727204621.98728: starting attempt loop 42613 1727204621.98731: running the handler 42613 1727204621.98838: variable 'ansible_facts' from source: unknown 42613 1727204621.98844: _low_level_execute_command(): starting 42613 1727204621.98846: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 42613 1727204621.99643: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204621.99664: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 42613 
1727204621.99727: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204621.99804: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204621.99850: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204621.99957: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204622.01923: stdout chunk (state=3): >>>/root <<< 42613 1727204622.01941: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204622.02115: stderr chunk (state=3): >>><<< 42613 1727204622.02119: stdout chunk (state=3): >>><<< 42613 1727204622.02348: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204622.02353: _low_level_execute_command(): starting 42613 1727204622.02356: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204622.0220952-46144-7564263267657 `" && echo ansible-tmp-1727204622.0220952-46144-7564263267657="` echo /root/.ansible/tmp/ansible-tmp-1727204622.0220952-46144-7564263267657 `" ) && sleep 0' 42613 1727204622.03124: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204622.03130: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204622.03218: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204622.03326: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204622.05472: stdout chunk (state=3): >>>ansible-tmp-1727204622.0220952-46144-7564263267657=/root/.ansible/tmp/ansible-tmp-1727204622.0220952-46144-7564263267657 <<< 42613 1727204622.05691: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204622.05695: stdout chunk (state=3): >>><<< 42613 1727204622.05697: stderr chunk (state=3): >>><<< 42613 1727204622.05716: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204622.0220952-46144-7564263267657=/root/.ansible/tmp/ansible-tmp-1727204622.0220952-46144-7564263267657 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 
10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204622.05873: variable 'ansible_module_compression' from source: unknown 42613 1727204622.05877: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-42613igbktgy6/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 42613 1727204622.05903: variable 'ansible_facts' from source: unknown 42613 1727204622.06115: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204622.0220952-46144-7564263267657/AnsiballZ_setup.py 42613 1727204622.06362: Sending initial data 42613 1727204622.06368: Sent initial data (152 bytes) 42613 1727204622.07010: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204622.07025: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204622.07039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204622.07057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 42613 1727204622.07110: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 
10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204622.07174: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204622.07192: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204622.07220: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204622.07323: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204622.09119: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 42613 1727204622.09237: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 42613 1727204622.09295: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-42613igbktgy6/tmpndjndf0c /root/.ansible/tmp/ansible-tmp-1727204622.0220952-46144-7564263267657/AnsiballZ_setup.py <<< 42613 1727204622.09299: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204622.0220952-46144-7564263267657/AnsiballZ_setup.py" <<< 42613 1727204622.09401: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-42613igbktgy6/tmpndjndf0c" to remote "/root/.ansible/tmp/ansible-tmp-1727204622.0220952-46144-7564263267657/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204622.0220952-46144-7564263267657/AnsiballZ_setup.py" <<< 42613 1727204622.11810: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204622.11894: stderr chunk (state=3): >>><<< 42613 1727204622.11904: stdout chunk (state=3): >>><<< 42613 1727204622.11950: done transferring module to remote 42613 1727204622.11973: _low_level_execute_command(): starting 42613 1727204622.12027: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204622.0220952-46144-7564263267657/ /root/.ansible/tmp/ansible-tmp-1727204622.0220952-46144-7564263267657/AnsiballZ_setup.py && sleep 0' 42613 1727204622.13547: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204622.13688: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204622.13805: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204622.13856: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204622.13875: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204622.13976: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204622.16101: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204622.16343: stderr chunk (state=3): >>><<< 42613 1727204622.16347: stdout chunk (state=3): >>><<< 42613 1727204622.16582: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204622.16586: _low_level_execute_command(): starting 42613 1727204622.16589: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204622.0220952-46144-7564263267657/AnsiballZ_setup.py && sleep 0' 42613 1727204622.18004: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204622.18261: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204622.18455: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204622.18459: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204622.88026: stdout chunk (state=3): >>> 
{"ansible_facts": {"ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "ansible_lsb": {}, "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2f45753f4562ec5cb5af974e0ba4b7", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "03", "second": "42", "epoch": "1727204622", "epoch_int": "1727204622", "date": "2024-09-24", "time": "15:03:42", "iso8601_micro": "2024-09-24T19:03:42.495020Z", "iso8601": "2024-09-24T19:03:42Z", "iso8601_basic": "20240924T150342495020", "iso8601_basic_short": "20240924T150342", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDMno02CCUpdfdqhSLw3DEMgRB2qjjltGvjvvCA4FaqYC6LCLimL/modyFwZBTr6mod5aVbEv6eVT3cFEFELkt0kQvtOziptiFW5YZ0dlqvF004nHv7tpUqUboaKXf3hY9kfDIHOuUKZOV1AH7UTuNGixxTuYXFQ+fG7hLGh4Vep864Qk6wN5hv56JDtXEzMMB7xxbnEU6nTFIA8TIX+aYYYxIipVJjI+TR9J9VhQf+oNDJhhqqHyobnqG5WTt3jEYQo+8cWC4B8LegOCwae4jpCrLwhKnzmvV787NTqy90vgHgain4GhWTCKI+2dFsqryKBgKIBuENXOpmpRGT4gqBQHbc5v/vxWqYoPPhg1Wb8R+WRueYbdol4I10CveFNShlWCSRLSu/vOutS0xtU3WEIQFs2Mn06Aqq6bMoG70EJ9bJOEQ82f23VIkVKoW1hmcKHTCrMv715oNONo08LOTQkBYDv3MQpAtFQnpuIVPlAXYu1spAx3i2i31866ukCUE=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBG5WDcSyqLv17rg+6P3+2pjKf2x2X+Jf8yHGACagVeIm/l8LWG2NszXTHOdaZlbD4aes7hBRe0B7oCa8ilqHGf0=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAII5o3yNikV31ncy7je2EsNwog36vbYT7D9w98r4ZeD7x", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fibre_channel_wwn": [], "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_iscsi_iqn": "", "ansible_is_chroot": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_fips": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": 
"/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 39754 10.31.45.169 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 39754 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3040, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 676, "free": 3040}, "nocache": {"free": 3485, "used": 231}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", 
"ansible_product_name": "HVM domU", "ansible_product_serial": "ec2f4575-3f45-62ec-5cb5-af974e0ba4b7", "ansible_product_uuid": "ec2f4575-3f45-62ec-5cb5-af974e0ba4b7", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 959, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251303542784, "block_size": 4096, "block_total": 64479564, "block_available": 61353404, "block_used": 3126160, "inode_total": 16384000, "inode_available": 16301442, "inode_used": 
82558, "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391"}], "ansible_loadavg": {"1m": 0.478515625, "5m": 0.59912109375, "15m": 0.41943359375}, "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off 
[fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "02:aa:78:a8:9b:13", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.169", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::aa:78ff:fea8:9b13", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off 
[fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.169", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:aa:78:a8:9b:13", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.169"], "ansible_all_ipv6_addresses": ["fe80::aa:78ff:fea8:9b13"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.169", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", 
"fe80::aa:78ff:fea8:9b13"]}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_local": {}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 42613 1727204622.90088: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204622.90197: stderr chunk (state=3): >>>Shared connection to 10.31.45.169 closed. <<< 42613 1727204622.90233: stdout chunk (state=3): >>><<< 42613 1727204622.90236: stderr chunk (state=3): >>><<< 42613 1727204622.90503: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "ansible_lsb": {}, "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2f45753f4562ec5cb5af974e0ba4b7", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "03", "second": "42", "epoch": "1727204622", "epoch_int": "1727204622", "date": "2024-09-24", "time": "15:03:42", "iso8601_micro": "2024-09-24T19:03:42.495020Z", "iso8601": "2024-09-24T19:03:42Z", "iso8601_basic": 
"20240924T150342495020", "iso8601_basic_short": "20240924T150342", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDMno02CCUpdfdqhSLw3DEMgRB2qjjltGvjvvCA4FaqYC6LCLimL/modyFwZBTr6mod5aVbEv6eVT3cFEFELkt0kQvtOziptiFW5YZ0dlqvF004nHv7tpUqUboaKXf3hY9kfDIHOuUKZOV1AH7UTuNGixxTuYXFQ+fG7hLGh4Vep864Qk6wN5hv56JDtXEzMMB7xxbnEU6nTFIA8TIX+aYYYxIipVJjI+TR9J9VhQf+oNDJhhqqHyobnqG5WTt3jEYQo+8cWC4B8LegOCwae4jpCrLwhKnzmvV787NTqy90vgHgain4GhWTCKI+2dFsqryKBgKIBuENXOpmpRGT4gqBQHbc5v/vxWqYoPPhg1Wb8R+WRueYbdol4I10CveFNShlWCSRLSu/vOutS0xtU3WEIQFs2Mn06Aqq6bMoG70EJ9bJOEQ82f23VIkVKoW1hmcKHTCrMv715oNONo08LOTQkBYDv3MQpAtFQnpuIVPlAXYu1spAx3i2i31866ukCUE=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBG5WDcSyqLv17rg+6P3+2pjKf2x2X+Jf8yHGACagVeIm/l8LWG2NszXTHOdaZlbD4aes7hBRe0B7oCa8ilqHGf0=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAII5o3yNikV31ncy7je2EsNwog36vbYT7D9w98r4ZeD7x", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fibre_channel_wwn": [], "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_iscsi_iqn": "", "ansible_is_chroot": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": 
"UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_fips": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 39754 10.31.45.169 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 39754 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, 
"ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3040, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 676, "free": 3040}, "nocache": {"free": 3485, "used": 231}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2f4575-3f45-62ec-5cb5-af974e0ba4b7", "ansible_product_uuid": "ec2f4575-3f45-62ec-5cb5-af974e0ba4b7", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": 
null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 959, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251303542784, "block_size": 4096, "block_total": 64479564, "block_available": 61353404, "block_used": 3126160, "inode_total": 16384000, "inode_available": 16301442, "inode_used": 82558, "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391"}], "ansible_loadavg": {"1m": 0.478515625, "5m": 0.59912109375, "15m": 0.41943359375}, "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off 
[fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "02:aa:78:a8:9b:13", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.169", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::aa:78ff:fea8:9b13", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": 
"on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", 
"rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.169", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:aa:78:a8:9b:13", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.169"], "ansible_all_ipv6_addresses": ["fe80::aa:78ff:fea8:9b13"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.169", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::aa:78ff:fea8:9b13"]}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_local": {}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 42613 1727204622.90847: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204622.0220952-46144-7564263267657/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 42613 1727204622.90902: _low_level_execute_command(): starting 42613 1727204622.90912: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204622.0220952-46144-7564263267657/ > /dev/null 2>&1 && sleep 0' 42613 1727204622.91687: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204622.91764: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204622.91789: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204622.91813: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204622.91984: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204622.94373: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204622.94378: stdout chunk (state=3): >>><<< 42613 1727204622.94381: stderr chunk (state=3): >>><<< 42613 1727204622.94384: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204622.94386: handler run complete 42613 1727204622.94587: variable 'ansible_facts' from source: unknown 42613 1727204622.94722: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204622.95462: variable 'ansible_facts' from source: unknown 42613 1727204622.95594: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204622.95790: attempt loop complete, returning result 42613 1727204622.95800: _execute() done 42613 1727204622.95806: dumping result to json 42613 1727204622.95846: done dumping result, returning 42613 1727204622.95867: done running TaskExecutor() for managed-node3/TASK: Gathering Facts [127b8e07-fff9-2f91-05d8-00000000066a] 42613 1727204622.95885: sending task result for task 127b8e07-fff9-2f91-05d8-00000000066a 42613 1727204622.96645: done sending task result for task 127b8e07-fff9-2f91-05d8-00000000066a 42613 1727204622.96649: WORKER PROCESS EXITING ok: [managed-node3] 42613 1727204622.97123: no more pending results, returning what we have 42613 1727204622.97127: results queue empty 42613 1727204622.97128: checking for any_errors_fatal 42613 1727204622.97129: done checking for any_errors_fatal 42613 1727204622.97130: checking for max_fail_percentage 42613 1727204622.97131: done checking for max_fail_percentage 42613 1727204622.97132: checking to see if all hosts have failed and the running result is not ok 42613 1727204622.97133: done checking to see if all hosts have failed 42613 1727204622.97134: getting the remaining hosts for this loop 42613 1727204622.97135: done getting the remaining hosts 
for this loop 42613 1727204622.97142: getting the next task for host managed-node3 42613 1727204622.97147: done getting next task for host managed-node3 42613 1727204622.97149: ^ task is: TASK: meta (flush_handlers) 42613 1727204622.97151: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 42613 1727204622.97155: getting variables 42613 1727204622.97157: in VariableManager get_vars() 42613 1727204622.97187: Calling all_inventory to load vars for managed-node3 42613 1727204622.97190: Calling groups_inventory to load vars for managed-node3 42613 1727204622.97193: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204622.97205: Calling all_plugins_play to load vars for managed-node3 42613 1727204622.97208: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204622.97212: Calling groups_plugins_play to load vars for managed-node3 42613 1727204623.01497: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204623.07228: done with get_vars() 42613 1727204623.07277: done getting variables 42613 1727204623.07361: in VariableManager get_vars() 42613 1727204623.07787: Calling all_inventory to load vars for managed-node3 42613 1727204623.07791: Calling groups_inventory to load vars for managed-node3 42613 1727204623.07794: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204623.07801: Calling all_plugins_play to load vars for managed-node3 42613 1727204623.07804: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204623.07808: Calling groups_plugins_play to load vars for managed-node3 42613 1727204623.10720: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' 
skipped due to reserved name 42613 1727204623.14971: done with get_vars() 42613 1727204623.15024: done queuing things up, now waiting for results queue to drain 42613 1727204623.15026: results queue empty 42613 1727204623.15027: checking for any_errors_fatal 42613 1727204623.15033: done checking for any_errors_fatal 42613 1727204623.15034: checking for max_fail_percentage 42613 1727204623.15035: done checking for max_fail_percentage 42613 1727204623.15036: checking to see if all hosts have failed and the running result is not ok 42613 1727204623.15036: done checking to see if all hosts have failed 42613 1727204623.15045: getting the remaining hosts for this loop 42613 1727204623.15046: done getting the remaining hosts for this loop 42613 1727204623.15050: getting the next task for host managed-node3 42613 1727204623.15055: done getting next task for host managed-node3 42613 1727204623.15058: ^ task is: TASK: Include the task 'assert_profile_absent.yml' 42613 1727204623.15060: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204623.15063: getting variables 42613 1727204623.15064: in VariableManager get_vars() 42613 1727204623.15173: Calling all_inventory to load vars for managed-node3 42613 1727204623.15176: Calling groups_inventory to load vars for managed-node3 42613 1727204623.15179: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204623.15186: Calling all_plugins_play to load vars for managed-node3 42613 1727204623.15189: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204623.15192: Calling groups_plugins_play to load vars for managed-node3 42613 1727204623.19235: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204623.24291: done with get_vars() 42613 1727204623.24386: done getting variables TASK [Include the task 'assert_profile_absent.yml'] **************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:230 Tuesday 24 September 2024 15:03:43 -0400 (0:00:01.287) 0:00:51.854 ***** 42613 1727204623.24598: entering _queue_task() for managed-node3/include_tasks 42613 1727204623.26239: worker is 1 (out of 1 available) 42613 1727204623.26259: exiting _queue_task() for managed-node3/include_tasks 42613 1727204623.26276: done queuing things up, now waiting for results queue to drain 42613 1727204623.26278: waiting for pending results... 
42613 1727204623.27207: running TaskExecutor() for managed-node3/TASK: Include the task 'assert_profile_absent.yml' 42613 1727204623.27487: in run() - task 127b8e07-fff9-2f91-05d8-0000000000a9 42613 1727204623.27492: variable 'ansible_search_path' from source: unknown 42613 1727204623.27495: calling self._execute() 42613 1727204623.27528: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204623.27541: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204623.27564: variable 'omit' from source: magic vars 42613 1727204623.28024: variable 'ansible_distribution_major_version' from source: facts 42613 1727204623.28048: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204623.28060: _execute() done 42613 1727204623.28070: dumping result to json 42613 1727204623.28079: done dumping result, returning 42613 1727204623.28090: done running TaskExecutor() for managed-node3/TASK: Include the task 'assert_profile_absent.yml' [127b8e07-fff9-2f91-05d8-0000000000a9] 42613 1727204623.28106: sending task result for task 127b8e07-fff9-2f91-05d8-0000000000a9 42613 1727204623.28270: no more pending results, returning what we have 42613 1727204623.28276: in VariableManager get_vars() 42613 1727204623.28314: Calling all_inventory to load vars for managed-node3 42613 1727204623.28317: Calling groups_inventory to load vars for managed-node3 42613 1727204623.28321: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204623.28338: Calling all_plugins_play to load vars for managed-node3 42613 1727204623.28344: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204623.28347: Calling groups_plugins_play to load vars for managed-node3 42613 1727204623.29086: done sending task result for task 127b8e07-fff9-2f91-05d8-0000000000a9 42613 1727204623.29091: WORKER PROCESS EXITING 42613 1727204623.32307: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204623.35256: done with get_vars() 42613 1727204623.35499: variable 'ansible_search_path' from source: unknown 42613 1727204623.35519: we have included files to process 42613 1727204623.35520: generating all_blocks data 42613 1727204623.35522: done generating all_blocks data 42613 1727204623.35523: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 42613 1727204623.35524: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 42613 1727204623.35526: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 42613 1727204623.35912: in VariableManager get_vars() 42613 1727204623.35932: done with get_vars() 42613 1727204623.36274: done processing included file 42613 1727204623.36277: iterating over new_blocks loaded from include file 42613 1727204623.36279: in VariableManager get_vars() 42613 1727204623.36297: done with get_vars() 42613 1727204623.36298: filtering new block on tags 42613 1727204623.36320: done filtering new block on tags 42613 1727204623.36323: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for managed-node3 42613 1727204623.36329: extending task lists for all hosts with included blocks 42613 1727204623.36413: done extending task lists 42613 1727204623.36414: done processing included files 42613 1727204623.36415: results queue empty 42613 1727204623.36416: checking for any_errors_fatal 42613 1727204623.36418: done checking for any_errors_fatal 42613 1727204623.36419: checking for max_fail_percentage 42613 1727204623.36420: done 
checking for max_fail_percentage 42613 1727204623.36421: checking to see if all hosts have failed and the running result is not ok 42613 1727204623.36423: done checking to see if all hosts have failed 42613 1727204623.36424: getting the remaining hosts for this loop 42613 1727204623.36425: done getting the remaining hosts for this loop 42613 1727204623.36428: getting the next task for host managed-node3 42613 1727204623.36432: done getting next task for host managed-node3 42613 1727204623.36435: ^ task is: TASK: Include the task 'get_profile_stat.yml' 42613 1727204623.36437: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204623.36442: getting variables 42613 1727204623.36443: in VariableManager get_vars() 42613 1727204623.36455: Calling all_inventory to load vars for managed-node3 42613 1727204623.36458: Calling groups_inventory to load vars for managed-node3 42613 1727204623.36461: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204623.36672: Calling all_plugins_play to load vars for managed-node3 42613 1727204623.36676: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204623.36681: Calling groups_plugins_play to load vars for managed-node3 42613 1727204623.39002: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204623.42798: done with get_vars() 42613 1727204623.42843: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3 Tuesday 24 September 2024 15:03:43 -0400 (0:00:00.183) 0:00:52.037 ***** 42613 1727204623.42936: entering _queue_task() for managed-node3/include_tasks 42613 1727204623.43368: worker is 1 (out of 1 available) 42613 1727204623.43384: exiting _queue_task() for managed-node3/include_tasks 42613 1727204623.43398: done queuing things up, now waiting for results queue to drain 42613 1727204623.43399: waiting for pending results... 
42613 1727204623.44059: running TaskExecutor() for managed-node3/TASK: Include the task 'get_profile_stat.yml' 42613 1727204623.44326: in run() - task 127b8e07-fff9-2f91-05d8-00000000067b 42613 1727204623.44356: variable 'ansible_search_path' from source: unknown 42613 1727204623.44368: variable 'ansible_search_path' from source: unknown 42613 1727204623.44504: calling self._execute() 42613 1727204623.44665: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204623.44730: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204623.44938: variable 'omit' from source: magic vars 42613 1727204623.45697: variable 'ansible_distribution_major_version' from source: facts 42613 1727204623.45719: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204623.45731: _execute() done 42613 1727204623.45739: dumping result to json 42613 1727204623.45924: done dumping result, returning 42613 1727204623.45928: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_profile_stat.yml' [127b8e07-fff9-2f91-05d8-00000000067b] 42613 1727204623.45932: sending task result for task 127b8e07-fff9-2f91-05d8-00000000067b 42613 1727204623.46019: done sending task result for task 127b8e07-fff9-2f91-05d8-00000000067b 42613 1727204623.46269: WORKER PROCESS EXITING 42613 1727204623.46303: no more pending results, returning what we have 42613 1727204623.46309: in VariableManager get_vars() 42613 1727204623.46355: Calling all_inventory to load vars for managed-node3 42613 1727204623.46359: Calling groups_inventory to load vars for managed-node3 42613 1727204623.46363: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204623.46384: Calling all_plugins_play to load vars for managed-node3 42613 1727204623.46388: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204623.46391: Calling groups_plugins_play to load vars for managed-node3 42613 
1727204623.49018: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204623.53737: done with get_vars() 42613 1727204623.53876: variable 'ansible_search_path' from source: unknown 42613 1727204623.53878: variable 'ansible_search_path' from source: unknown 42613 1727204623.53997: we have included files to process 42613 1727204623.53998: generating all_blocks data 42613 1727204623.54000: done generating all_blocks data 42613 1727204623.54002: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 42613 1727204623.54003: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 42613 1727204623.54006: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 42613 1727204623.56939: done processing included file 42613 1727204623.56944: iterating over new_blocks loaded from include file 42613 1727204623.56946: in VariableManager get_vars() 42613 1727204623.56972: done with get_vars() 42613 1727204623.56975: filtering new block on tags 42613 1727204623.57004: done filtering new block on tags 42613 1727204623.57008: in VariableManager get_vars() 42613 1727204623.57024: done with get_vars() 42613 1727204623.57025: filtering new block on tags 42613 1727204623.57053: done filtering new block on tags 42613 1727204623.57056: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node3 42613 1727204623.57063: extending task lists for all hosts with included blocks 42613 1727204623.57195: done extending task lists 42613 1727204623.57197: done processing included files 42613 1727204623.57197: results queue empty 42613 
1727204623.57198: checking for any_errors_fatal 42613 1727204623.57202: done checking for any_errors_fatal 42613 1727204623.57203: checking for max_fail_percentage 42613 1727204623.57204: done checking for max_fail_percentage 42613 1727204623.57205: checking to see if all hosts have failed and the running result is not ok 42613 1727204623.57206: done checking to see if all hosts have failed 42613 1727204623.57206: getting the remaining hosts for this loop 42613 1727204623.57208: done getting the remaining hosts for this loop 42613 1727204623.57210: getting the next task for host managed-node3 42613 1727204623.57214: done getting next task for host managed-node3 42613 1727204623.57217: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 42613 1727204623.57220: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204623.57223: getting variables 42613 1727204623.57225: in VariableManager get_vars() 42613 1727204623.57369: Calling all_inventory to load vars for managed-node3 42613 1727204623.57373: Calling groups_inventory to load vars for managed-node3 42613 1727204623.57376: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204623.57382: Calling all_plugins_play to load vars for managed-node3 42613 1727204623.57385: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204623.57388: Calling groups_plugins_play to load vars for managed-node3 42613 1727204623.59244: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204623.72747: done with get_vars() 42613 1727204623.72844: done getting variables 42613 1727204623.73102: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Tuesday 24 September 2024 15:03:43 -0400 (0:00:00.301) 0:00:52.339 ***** 42613 1727204623.73134: entering _queue_task() for managed-node3/set_fact 42613 1727204623.73702: worker is 1 (out of 1 available) 42613 1727204623.73716: exiting _queue_task() for managed-node3/set_fact 42613 1727204623.73731: done queuing things up, now waiting for results queue to drain 42613 1727204623.73733: waiting for pending results... 
42613 1727204623.74053: running TaskExecutor() for managed-node3/TASK: Initialize NM profile exist and ansible_managed comment flag 42613 1727204623.74235: in run() - task 127b8e07-fff9-2f91-05d8-00000000068a 42613 1727204623.74389: variable 'ansible_search_path' from source: unknown 42613 1727204623.74393: variable 'ansible_search_path' from source: unknown 42613 1727204623.74397: calling self._execute() 42613 1727204623.74487: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204623.74496: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204623.74502: variable 'omit' from source: magic vars 42613 1727204623.74855: variable 'ansible_distribution_major_version' from source: facts 42613 1727204623.74867: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204623.74874: variable 'omit' from source: magic vars 42613 1727204623.74919: variable 'omit' from source: magic vars 42613 1727204623.74950: variable 'omit' from source: magic vars 42613 1727204623.74991: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 42613 1727204623.75024: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 42613 1727204623.75046: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 42613 1727204623.75060: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204623.75072: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204623.75099: variable 'inventory_hostname' from source: host vars for 'managed-node3' 42613 1727204623.75104: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204623.75107: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node3' 42613 1727204623.75217: Set connection var ansible_shell_executable to /bin/sh 42613 1727204623.75226: Set connection var ansible_pipelining to False 42613 1727204623.75238: Set connection var ansible_module_compression to ZIP_DEFLATED 42613 1727204623.75244: Set connection var ansible_connection to ssh 42613 1727204623.75248: Set connection var ansible_timeout to 10 42613 1727204623.75251: Set connection var ansible_shell_type to sh 42613 1727204623.75424: variable 'ansible_shell_executable' from source: unknown 42613 1727204623.75427: variable 'ansible_connection' from source: unknown 42613 1727204623.75430: variable 'ansible_module_compression' from source: unknown 42613 1727204623.75433: variable 'ansible_shell_type' from source: unknown 42613 1727204623.75435: variable 'ansible_shell_executable' from source: unknown 42613 1727204623.75438: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204623.75440: variable 'ansible_pipelining' from source: unknown 42613 1727204623.75442: variable 'ansible_timeout' from source: unknown 42613 1727204623.75445: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204623.75572: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 42613 1727204623.75576: variable 'omit' from source: magic vars 42613 1727204623.75579: starting attempt loop 42613 1727204623.75582: running the handler 42613 1727204623.75586: handler run complete 42613 1727204623.75590: attempt loop complete, returning result 42613 1727204623.75592: _execute() done 42613 1727204623.75596: dumping result to json 42613 1727204623.75598: done dumping result, returning 42613 1727204623.75601: done running TaskExecutor() for 
managed-node3/TASK: Initialize NM profile exist and ansible_managed comment flag [127b8e07-fff9-2f91-05d8-00000000068a] 42613 1727204623.75605: sending task result for task 127b8e07-fff9-2f91-05d8-00000000068a ok: [managed-node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 42613 1727204623.75757: no more pending results, returning what we have 42613 1727204623.75760: results queue empty 42613 1727204623.75761: checking for any_errors_fatal 42613 1727204623.75762: done checking for any_errors_fatal 42613 1727204623.75763: checking for max_fail_percentage 42613 1727204623.75768: done checking for max_fail_percentage 42613 1727204623.75769: checking to see if all hosts have failed and the running result is not ok 42613 1727204623.75770: done checking to see if all hosts have failed 42613 1727204623.75770: getting the remaining hosts for this loop 42613 1727204623.75772: done getting the remaining hosts for this loop 42613 1727204623.75777: getting the next task for host managed-node3 42613 1727204623.75787: done getting next task for host managed-node3 42613 1727204623.75789: ^ task is: TASK: Stat profile file 42613 1727204623.75793: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 42613 1727204623.75799: getting variables 42613 1727204623.75801: in VariableManager get_vars() 42613 1727204623.75836: Calling all_inventory to load vars for managed-node3 42613 1727204623.75838: Calling groups_inventory to load vars for managed-node3 42613 1727204623.75842: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204623.75855: Calling all_plugins_play to load vars for managed-node3 42613 1727204623.75859: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204623.75861: Calling groups_plugins_play to load vars for managed-node3 42613 1727204623.76611: done sending task result for task 127b8e07-fff9-2f91-05d8-00000000068a 42613 1727204623.76675: WORKER PROCESS EXITING 42613 1727204623.78017: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204623.80131: done with get_vars() 42613 1727204623.80178: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 15:03:43 -0400 (0:00:00.071) 0:00:52.411 ***** 42613 1727204623.80298: entering _queue_task() for managed-node3/stat 42613 1727204623.80826: worker is 1 (out of 1 available) 42613 1727204623.80839: exiting _queue_task() for managed-node3/stat 42613 1727204623.80852: done queuing things up, now waiting for results queue to drain 42613 1727204623.80853: waiting for pending results... 
42613 1727204623.81152: running TaskExecutor() for managed-node3/TASK: Stat profile file 42613 1727204623.81320: in run() - task 127b8e07-fff9-2f91-05d8-00000000068b 42613 1727204623.81350: variable 'ansible_search_path' from source: unknown 42613 1727204623.81358: variable 'ansible_search_path' from source: unknown 42613 1727204623.81416: calling self._execute() 42613 1727204623.81543: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204623.81557: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204623.81600: variable 'omit' from source: magic vars 42613 1727204623.82038: variable 'ansible_distribution_major_version' from source: facts 42613 1727204623.82121: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204623.82126: variable 'omit' from source: magic vars 42613 1727204623.82129: variable 'omit' from source: magic vars 42613 1727204623.82241: variable 'profile' from source: include params 42613 1727204623.82249: variable 'interface' from source: set_fact 42613 1727204623.82309: variable 'interface' from source: set_fact 42613 1727204623.82325: variable 'omit' from source: magic vars 42613 1727204623.82368: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 42613 1727204623.82401: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 42613 1727204623.82421: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 42613 1727204623.82438: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204623.82453: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204623.82482: variable 'inventory_hostname' from source: host vars for 'managed-node3' 42613 
1727204623.82486: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204623.82489: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204623.82583: Set connection var ansible_shell_executable to /bin/sh 42613 1727204623.82587: Set connection var ansible_pipelining to False 42613 1727204623.82593: Set connection var ansible_module_compression to ZIP_DEFLATED 42613 1727204623.82598: Set connection var ansible_connection to ssh 42613 1727204623.82600: Set connection var ansible_timeout to 10 42613 1727204623.82603: Set connection var ansible_shell_type to sh 42613 1727204623.82625: variable 'ansible_shell_executable' from source: unknown 42613 1727204623.82628: variable 'ansible_connection' from source: unknown 42613 1727204623.82630: variable 'ansible_module_compression' from source: unknown 42613 1727204623.82634: variable 'ansible_shell_type' from source: unknown 42613 1727204623.82637: variable 'ansible_shell_executable' from source: unknown 42613 1727204623.82639: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204623.82641: variable 'ansible_pipelining' from source: unknown 42613 1727204623.82649: variable 'ansible_timeout' from source: unknown 42613 1727204623.82654: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204623.82906: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 42613 1727204623.82943: variable 'omit' from source: magic vars 42613 1727204623.82947: starting attempt loop 42613 1727204623.82955: running the handler 42613 1727204623.82959: _low_level_execute_command(): starting 42613 1727204623.82961: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 42613 1727204623.83746: stderr chunk 
(state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204623.83772: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204623.83790: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204623.83808: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 42613 1727204623.83828: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 42613 1727204623.83937: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204623.83960: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204623.84077: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204623.85892: stdout chunk (state=3): >>>/root <<< 42613 1727204623.86030: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204623.86067: stderr chunk (state=3): >>><<< 42613 1727204623.86071: stdout chunk (state=3): >>><<< 42613 1727204623.86096: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204623.86111: _low_level_execute_command(): starting 42613 1727204623.86172: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204623.8609724-46218-120366178567749 `" && echo ansible-tmp-1727204623.8609724-46218-120366178567749="` echo /root/.ansible/tmp/ansible-tmp-1727204623.8609724-46218-120366178567749 `" ) && sleep 0' 42613 1727204623.86630: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204623.86634: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204623.86650: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found <<< 42613 1727204623.86654: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204623.86701: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204623.86704: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204623.86708: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204623.86787: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204623.88928: stdout chunk (state=3): >>>ansible-tmp-1727204623.8609724-46218-120366178567749=/root/.ansible/tmp/ansible-tmp-1727204623.8609724-46218-120366178567749 <<< 42613 1727204623.89042: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204623.89109: stderr chunk (state=3): >>><<< 42613 1727204623.89112: stdout chunk (state=3): >>><<< 42613 1727204623.89131: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204623.8609724-46218-120366178567749=/root/.ansible/tmp/ansible-tmp-1727204623.8609724-46218-120366178567749 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204623.89183: variable 'ansible_module_compression' from source: unknown 42613 1727204623.89232: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-42613igbktgy6/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 42613 1727204623.89270: variable 'ansible_facts' from source: unknown 42613 1727204623.89325: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204623.8609724-46218-120366178567749/AnsiballZ_stat.py 42613 1727204623.89444: Sending initial data 42613 1727204623.89447: Sent initial data (153 bytes) 42613 1727204623.89955: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204623.89959: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 42613 
1727204623.89962: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204623.90018: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204623.90021: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204623.90024: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204623.90102: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204623.91885: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 42613 1727204623.91949: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 42613 1727204623.92019: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-42613igbktgy6/tmphsc0zjs5 /root/.ansible/tmp/ansible-tmp-1727204623.8609724-46218-120366178567749/AnsiballZ_stat.py <<< 42613 1727204623.92027: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204623.8609724-46218-120366178567749/AnsiballZ_stat.py" <<< 42613 1727204623.92085: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-42613igbktgy6/tmphsc0zjs5" to remote "/root/.ansible/tmp/ansible-tmp-1727204623.8609724-46218-120366178567749/AnsiballZ_stat.py" <<< 42613 1727204623.92087: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204623.8609724-46218-120366178567749/AnsiballZ_stat.py" <<< 42613 1727204623.92750: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204623.92834: stderr chunk (state=3): >>><<< 42613 1727204623.92837: stdout chunk (state=3): >>><<< 42613 1727204623.92862: done transferring module to remote 42613 1727204623.92878: _low_level_execute_command(): starting 42613 1727204623.92882: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204623.8609724-46218-120366178567749/ /root/.ansible/tmp/ansible-tmp-1727204623.8609724-46218-120366178567749/AnsiballZ_stat.py && sleep 0' 42613 1727204623.93373: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204623.93378: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 
1727204623.93393: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204623.93396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204623.93455: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204623.93460: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204623.93530: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204623.95528: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204623.95597: stderr chunk (state=3): >>><<< 42613 1727204623.95601: stdout chunk (state=3): >>><<< 42613 1727204623.95615: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204623.95618: _low_level_execute_command(): starting 42613 1727204623.95625: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204623.8609724-46218-120366178567749/AnsiballZ_stat.py && sleep 0' 42613 1727204623.96134: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204623.96138: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204623.96141: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204623.96202: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204623.96213: stderr chunk (state=3): 
>>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204623.96216: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204623.96291: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204624.13818: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 42613 1727204624.15399: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. <<< 42613 1727204624.15507: stderr chunk (state=3): >>><<< 42613 1727204624.15511: stdout chunk (state=3): >>><<< 42613 1727204624.15657: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 42613 1727204624.15661: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204623.8609724-46218-120366178567749/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 42613 1727204624.15664: _low_level_execute_command(): starting 42613 1727204624.15668: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204623.8609724-46218-120366178567749/ > /dev/null 2>&1 && sleep 0' 42613 1727204624.16305: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204624.16325: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204624.16395: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found <<< 42613 1727204624.16407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204624.16495: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204624.16564: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204624.16626: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204624.18748: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204624.18753: stdout chunk (state=3): >>><<< 42613 1727204624.18756: stderr chunk (state=3): >>><<< 42613 1727204624.18974: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204624.18979: handler run complete 42613 1727204624.18982: attempt loop complete, returning result 42613 1727204624.18985: _execute() done 42613 1727204624.18987: dumping result to json 42613 1727204624.18990: done dumping result, returning 42613 1727204624.18992: done running TaskExecutor() for managed-node3/TASK: Stat profile file [127b8e07-fff9-2f91-05d8-00000000068b] 42613 1727204624.18995: sending task result for task 127b8e07-fff9-2f91-05d8-00000000068b 42613 1727204624.19078: done sending task result for task 127b8e07-fff9-2f91-05d8-00000000068b 42613 1727204624.19081: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "stat": { "exists": false } } 42613 1727204624.19170: no more pending results, returning what we have 42613 1727204624.19173: results queue empty 42613 1727204624.19175: checking for any_errors_fatal 42613 1727204624.19182: done checking for any_errors_fatal 42613 1727204624.19183: checking for max_fail_percentage 42613 1727204624.19185: done checking for max_fail_percentage 42613 1727204624.19185: checking to see if all hosts have failed and the running result is not ok 42613 1727204624.19186: done checking to see if all hosts have failed 42613 1727204624.19187: getting the remaining hosts for this loop 42613 1727204624.19189: done getting the remaining hosts for this loop 42613 1727204624.19193: getting the next task for host managed-node3 42613 1727204624.19203: done getting next task for host managed-node3 42613 1727204624.19206: ^ task is: TASK: 
Set NM profile exist flag based on the profile files 42613 1727204624.19210: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 42613 1727204624.19214: getting variables 42613 1727204624.19215: in VariableManager get_vars() 42613 1727204624.19248: Calling all_inventory to load vars for managed-node3 42613 1727204624.19251: Calling groups_inventory to load vars for managed-node3 42613 1727204624.19254: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204624.19442: Calling all_plugins_play to load vars for managed-node3 42613 1727204624.19447: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204624.19452: Calling groups_plugins_play to load vars for managed-node3 42613 1727204624.21467: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204624.23830: done with get_vars() 42613 1727204624.23873: done getting variables 42613 1727204624.23950: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 15:03:44 -0400 (0:00:00.436) 0:00:52.848 ***** 42613 1727204624.23988: entering _queue_task() for managed-node3/set_fact 42613 1727204624.24514: worker is 1 (out of 1 available) 42613 1727204624.24527: exiting _queue_task() for managed-node3/set_fact 42613 1727204624.24540: done queuing things up, now waiting for results queue to drain 42613 1727204624.24541: waiting for pending results... 42613 1727204624.24760: running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag based on the profile files 42613 1727204624.24900: in run() - task 127b8e07-fff9-2f91-05d8-00000000068c 42613 1727204624.24912: variable 'ansible_search_path' from source: unknown 42613 1727204624.24916: variable 'ansible_search_path' from source: unknown 42613 1727204624.24957: calling self._execute() 42613 1727204624.25273: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204624.25277: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204624.25281: variable 'omit' from source: magic vars 42613 1727204624.25537: variable 'ansible_distribution_major_version' from source: facts 42613 1727204624.25550: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204624.25700: variable 'profile_stat' from source: set_fact 42613 1727204624.25715: Evaluated conditional (profile_stat.stat.exists): False 42613 1727204624.25718: when evaluation is False, skipping this task 42613 1727204624.25721: _execute() done 42613 1727204624.25724: dumping result to json 42613 1727204624.25727: done dumping 
result, returning 42613 1727204624.25736: done running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag based on the profile files [127b8e07-fff9-2f91-05d8-00000000068c] 42613 1727204624.25750: sending task result for task 127b8e07-fff9-2f91-05d8-00000000068c 42613 1727204624.25859: done sending task result for task 127b8e07-fff9-2f91-05d8-00000000068c 42613 1727204624.25864: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 42613 1727204624.25920: no more pending results, returning what we have 42613 1727204624.25924: results queue empty 42613 1727204624.25926: checking for any_errors_fatal 42613 1727204624.25938: done checking for any_errors_fatal 42613 1727204624.25939: checking for max_fail_percentage 42613 1727204624.25941: done checking for max_fail_percentage 42613 1727204624.25943: checking to see if all hosts have failed and the running result is not ok 42613 1727204624.25944: done checking to see if all hosts have failed 42613 1727204624.25944: getting the remaining hosts for this loop 42613 1727204624.25946: done getting the remaining hosts for this loop 42613 1727204624.25951: getting the next task for host managed-node3 42613 1727204624.25962: done getting next task for host managed-node3 42613 1727204624.25966: ^ task is: TASK: Get NM profile info 42613 1727204624.25972: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 42613 1727204624.25977: getting variables 42613 1727204624.25979: in VariableManager get_vars() 42613 1727204624.26014: Calling all_inventory to load vars for managed-node3 42613 1727204624.26017: Calling groups_inventory to load vars for managed-node3 42613 1727204624.26022: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204624.26040: Calling all_plugins_play to load vars for managed-node3 42613 1727204624.26044: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204624.26048: Calling groups_plugins_play to load vars for managed-node3 42613 1727204624.28469: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204624.31650: done with get_vars() 42613 1727204624.31899: done getting variables 42613 1727204624.32012: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 15:03:44 -0400 (0:00:00.080) 0:00:52.928 ***** 42613 1727204624.32050: entering _queue_task() for managed-node3/shell 42613 1727204624.32052: Creating lock for shell 42613 1727204624.33110: worker is 1 (out of 1 available) 42613 1727204624.33126: exiting _queue_task() for managed-node3/shell 42613 
1727204624.33138: done queuing things up, now waiting for results queue to drain 42613 1727204624.33139: waiting for pending results... 42613 1727204624.33787: running TaskExecutor() for managed-node3/TASK: Get NM profile info 42613 1727204624.33795: in run() - task 127b8e07-fff9-2f91-05d8-00000000068d 42613 1727204624.33805: variable 'ansible_search_path' from source: unknown 42613 1727204624.33809: variable 'ansible_search_path' from source: unknown 42613 1727204624.33896: calling self._execute() 42613 1727204624.34126: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204624.34133: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204624.34143: variable 'omit' from source: magic vars 42613 1727204624.35037: variable 'ansible_distribution_major_version' from source: facts 42613 1727204624.35073: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204624.35077: variable 'omit' from source: magic vars 42613 1727204624.35258: variable 'omit' from source: magic vars 42613 1727204624.35405: variable 'profile' from source: include params 42613 1727204624.35416: variable 'interface' from source: set_fact 42613 1727204624.35508: variable 'interface' from source: set_fact 42613 1727204624.35534: variable 'omit' from source: magic vars 42613 1727204624.35598: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 42613 1727204624.35646: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 42613 1727204624.35680: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 42613 1727204624.35705: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204624.35772: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 42613 1727204624.35776: variable 'inventory_hostname' from source: host vars for 'managed-node3' 42613 1727204624.35779: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204624.35781: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204624.35905: Set connection var ansible_shell_executable to /bin/sh 42613 1727204624.35916: Set connection var ansible_pipelining to False 42613 1727204624.35929: Set connection var ansible_module_compression to ZIP_DEFLATED 42613 1727204624.35937: Set connection var ansible_connection to ssh 42613 1727204624.35950: Set connection var ansible_timeout to 10 42613 1727204624.35958: Set connection var ansible_shell_type to sh 42613 1727204624.35994: variable 'ansible_shell_executable' from source: unknown 42613 1727204624.36002: variable 'ansible_connection' from source: unknown 42613 1727204624.36071: variable 'ansible_module_compression' from source: unknown 42613 1727204624.36074: variable 'ansible_shell_type' from source: unknown 42613 1727204624.36077: variable 'ansible_shell_executable' from source: unknown 42613 1727204624.36079: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204624.36082: variable 'ansible_pipelining' from source: unknown 42613 1727204624.36084: variable 'ansible_timeout' from source: unknown 42613 1727204624.36086: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204624.36219: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 42613 1727204624.36238: variable 'omit' from source: magic vars 42613 1727204624.36253: starting attempt loop 42613 1727204624.36260: running the handler 42613 
1727204624.36278: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 42613 1727204624.36307: _low_level_execute_command(): starting 42613 1727204624.36325: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 42613 1727204624.37162: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204624.37188: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204624.37207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204624.37318: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204624.37336: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204624.37359: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204624.37375: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204624.37493: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 42613 1727204624.39318: stdout chunk (state=3): >>>/root <<< 42613 1727204624.39519: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204624.39554: stdout chunk (state=3): >>><<< 42613 1727204624.39558: stderr chunk (state=3): >>><<< 42613 1727204624.39585: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204624.39606: _low_level_execute_command(): starting 42613 1727204624.39703: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204624.3959312-46239-95344394067110 `" && echo ansible-tmp-1727204624.3959312-46239-95344394067110="` echo 
/root/.ansible/tmp/ansible-tmp-1727204624.3959312-46239-95344394067110 `" ) && sleep 0' 42613 1727204624.40372: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204624.40498: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204624.40502: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204624.40634: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204624.42820: stdout chunk (state=3): >>>ansible-tmp-1727204624.3959312-46239-95344394067110=/root/.ansible/tmp/ansible-tmp-1727204624.3959312-46239-95344394067110 <<< 42613 1727204624.43175: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204624.43180: stdout chunk (state=3): >>><<< 42613 1727204624.43182: stderr chunk (state=3): >>><<< 42613 1727204624.43185: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204624.3959312-46239-95344394067110=/root/.ansible/tmp/ansible-tmp-1727204624.3959312-46239-95344394067110 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204624.43188: variable 'ansible_module_compression' from source: unknown 42613 1727204624.43191: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-42613igbktgy6/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 42613 1727204624.43244: variable 'ansible_facts' from source: unknown 42613 1727204624.43338: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204624.3959312-46239-95344394067110/AnsiballZ_command.py 42613 1727204624.43872: Sending initial data 42613 1727204624.43875: Sent initial data (155 bytes) 42613 1727204624.44230: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204624.44281: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204624.44345: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204624.44349: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204624.44395: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204624.44463: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204624.46264: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" 
revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 42613 1727204624.46345: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 42613 1727204624.46411: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-42613igbktgy6/tmpmvdotagy /root/.ansible/tmp/ansible-tmp-1727204624.3959312-46239-95344394067110/AnsiballZ_command.py <<< 42613 1727204624.46414: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204624.3959312-46239-95344394067110/AnsiballZ_command.py" <<< 42613 1727204624.46507: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-42613igbktgy6/tmpmvdotagy" to remote "/root/.ansible/tmp/ansible-tmp-1727204624.3959312-46239-95344394067110/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204624.3959312-46239-95344394067110/AnsiballZ_command.py" <<< 42613 1727204624.47464: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204624.47506: stderr chunk (state=3): >>><<< 42613 1727204624.47519: stdout chunk (state=3): >>><<< 42613 1727204624.47557: done transferring module to remote 42613 1727204624.47586: _low_level_execute_command(): starting 42613 1727204624.47597: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204624.3959312-46239-95344394067110/ /root/.ansible/tmp/ansible-tmp-1727204624.3959312-46239-95344394067110/AnsiballZ_command.py && sleep 0' 42613 1727204624.48324: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204624.48349: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204624.48368: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204624.48387: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 42613 1727204624.48404: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 42613 1727204624.48445: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204624.48464: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 42613 1727204624.48558: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204624.48587: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204624.48691: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204624.50820: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204624.50832: stdout chunk (state=3): >>><<< 42613 1727204624.50851: stderr chunk (state=3): >>><<< 42613 1727204624.50876: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204624.50884: _low_level_execute_command(): starting 42613 1727204624.50894: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204624.3959312-46239-95344394067110/AnsiballZ_command.py && sleep 0' 42613 1727204624.51696: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204624.51729: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204624.51752: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204624.51781: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204624.51899: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204624.71551: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "start": "2024-09-24 15:03:44.693456", "end": "2024-09-24 15:03:44.712925", "delta": "0:00:00.019469", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 42613 1727204624.73630: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.45.169 closed. 
<<< 42613 1727204624.73636: stdout chunk (state=3): >>><<< 42613 1727204624.73638: stderr chunk (state=3): >>><<< 42613 1727204624.73644: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "start": "2024-09-24 15:03:44.693456", "end": "2024-09-24 15:03:44.712925", "delta": "0:00:00.019469", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 
10.31.45.169 closed. 42613 1727204624.73646: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204624.3959312-46239-95344394067110/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 42613 1727204624.73649: _low_level_execute_command(): starting 42613 1727204624.73651: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204624.3959312-46239-95344394067110/ > /dev/null 2>&1 && sleep 0' 42613 1727204624.74775: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204624.74793: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204624.74991: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204624.75063: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204624.75122: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204624.77268: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204624.77401: stderr chunk (state=3): >>><<< 42613 1727204624.77411: stdout chunk (state=3): >>><<< 42613 1727204624.77444: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 42613 1727204624.77462: handler run complete 42613 1727204624.77508: Evaluated conditional (False): False 42613 1727204624.77542: attempt loop complete, returning result 42613 1727204624.77551: _execute() done 42613 1727204624.77628: dumping result to json 42613 1727204624.77631: done dumping result, returning 42613 1727204624.77634: done running TaskExecutor() for managed-node3/TASK: Get NM profile info [127b8e07-fff9-2f91-05d8-00000000068d] 42613 1727204624.77639: sending task result for task 127b8e07-fff9-2f91-05d8-00000000068d 42613 1727204624.77730: done sending task result for task 127b8e07-fff9-2f91-05d8-00000000068d fatal: [managed-node3]: FAILED! => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "delta": "0:00:00.019469", "end": "2024-09-24 15:03:44.712925", "rc": 1, "start": "2024-09-24 15:03:44.693456" } MSG: non-zero return code ...ignoring 42613 1727204624.77839: no more pending results, returning what we have 42613 1727204624.77848: results queue empty 42613 1727204624.77849: checking for any_errors_fatal 42613 1727204624.77859: done checking for any_errors_fatal 42613 1727204624.77859: checking for max_fail_percentage 42613 1727204624.77862: done checking for max_fail_percentage 42613 1727204624.77863: checking to see if all hosts have failed and the running result is not ok 42613 1727204624.77864: done checking to see if all hosts have failed 42613 1727204624.77865: getting the remaining hosts for this loop 42613 1727204624.77869: done getting the remaining hosts for this loop 42613 1727204624.77875: getting the next task for host managed-node3 42613 1727204624.77883: done getting next task for host managed-node3 42613 1727204624.77887: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 42613 1727204624.77891: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 42613 1727204624.77896: getting variables 42613 1727204624.77898: in VariableManager get_vars() 42613 1727204624.77933: Calling all_inventory to load vars for managed-node3 42613 1727204624.77936: Calling groups_inventory to load vars for managed-node3 42613 1727204624.77942: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204624.77957: Calling all_plugins_play to load vars for managed-node3 42613 1727204624.77961: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204624.78191: Calling groups_plugins_play to load vars for managed-node3 42613 1727204624.78207: WORKER PROCESS EXITING 42613 1727204624.80772: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204624.83828: done with get_vars() 42613 1727204624.83873: done getting variables 42613 1727204624.83947: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the 
nmcli output] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 15:03:44 -0400 (0:00:00.519) 0:00:53.447 ***** 42613 1727204624.83985: entering _queue_task() for managed-node3/set_fact 42613 1727204624.84693: worker is 1 (out of 1 available) 42613 1727204624.84707: exiting _queue_task() for managed-node3/set_fact 42613 1727204624.84719: done queuing things up, now waiting for results queue to drain 42613 1727204624.84721: waiting for pending results... 42613 1727204624.84892: running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 42613 1727204624.85039: in run() - task 127b8e07-fff9-2f91-05d8-00000000068e 42613 1727204624.85073: variable 'ansible_search_path' from source: unknown 42613 1727204624.85082: variable 'ansible_search_path' from source: unknown 42613 1727204624.85127: calling self._execute() 42613 1727204624.85246: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204624.85260: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204624.85283: variable 'omit' from source: magic vars 42613 1727204624.85753: variable 'ansible_distribution_major_version' from source: facts 42613 1727204624.85798: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204624.85984: variable 'nm_profile_exists' from source: set_fact 42613 1727204624.86010: Evaluated conditional (nm_profile_exists.rc == 0): False 42613 1727204624.86018: when evaluation is False, skipping this task 42613 1727204624.86028: _execute() done 42613 1727204624.86044: dumping result to json 42613 1727204624.86054: done dumping result, returning 42613 1727204624.86070: done running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 
[127b8e07-fff9-2f91-05d8-00000000068e] 42613 1727204624.86082: sending task result for task 127b8e07-fff9-2f91-05d8-00000000068e 42613 1727204624.86351: done sending task result for task 127b8e07-fff9-2f91-05d8-00000000068e 42613 1727204624.86356: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "nm_profile_exists.rc == 0", "skip_reason": "Conditional result was False" } 42613 1727204624.86432: no more pending results, returning what we have 42613 1727204624.86437: results queue empty 42613 1727204624.86438: checking for any_errors_fatal 42613 1727204624.86450: done checking for any_errors_fatal 42613 1727204624.86452: checking for max_fail_percentage 42613 1727204624.86454: done checking for max_fail_percentage 42613 1727204624.86455: checking to see if all hosts have failed and the running result is not ok 42613 1727204624.86456: done checking to see if all hosts have failed 42613 1727204624.86457: getting the remaining hosts for this loop 42613 1727204624.86459: done getting the remaining hosts for this loop 42613 1727204624.86464: getting the next task for host managed-node3 42613 1727204624.86478: done getting next task for host managed-node3 42613 1727204624.86482: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 42613 1727204624.86487: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 42613 1727204624.86492: getting variables 42613 1727204624.86494: in VariableManager get_vars() 42613 1727204624.86531: Calling all_inventory to load vars for managed-node3 42613 1727204624.86534: Calling groups_inventory to load vars for managed-node3 42613 1727204624.86539: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204624.86555: Calling all_plugins_play to load vars for managed-node3 42613 1727204624.86559: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204624.86563: Calling groups_plugins_play to load vars for managed-node3 42613 1727204624.89945: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204624.92454: done with get_vars() 42613 1727204624.92530: done getting variables 42613 1727204624.92602: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 42613 1727204624.92746: variable 'profile' from source: include params 42613 1727204624.92750: variable 'interface' from source: set_fact 42613 1727204624.92820: variable 'interface' from source: set_fact TASK [Get the ansible_managed comment in ifcfg-ethtest0] *********************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 15:03:44 -0400 (0:00:00.088) 0:00:53.536 ***** 42613 1727204624.92861: entering _queue_task() for managed-node3/command 42613 1727204624.93270: worker is 1 (out of 1 available) 42613 1727204624.93398: 
exiting _queue_task() for managed-node3/command 42613 1727204624.93420: done queuing things up, now waiting for results queue to drain 42613 1727204624.93421: waiting for pending results... 42613 1727204624.93843: running TaskExecutor() for managed-node3/TASK: Get the ansible_managed comment in ifcfg-ethtest0 42613 1727204624.94270: in run() - task 127b8e07-fff9-2f91-05d8-000000000690 42613 1727204624.94485: variable 'ansible_search_path' from source: unknown 42613 1727204624.94489: variable 'ansible_search_path' from source: unknown 42613 1727204624.94492: calling self._execute() 42613 1727204624.94653: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204624.94714: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204624.94732: variable 'omit' from source: magic vars 42613 1727204624.95634: variable 'ansible_distribution_major_version' from source: facts 42613 1727204624.95664: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204624.95841: variable 'profile_stat' from source: set_fact 42613 1727204624.95869: Evaluated conditional (profile_stat.stat.exists): False 42613 1727204624.95878: when evaluation is False, skipping this task 42613 1727204624.95906: _execute() done 42613 1727204624.96011: dumping result to json 42613 1727204624.96015: done dumping result, returning 42613 1727204624.96018: done running TaskExecutor() for managed-node3/TASK: Get the ansible_managed comment in ifcfg-ethtest0 [127b8e07-fff9-2f91-05d8-000000000690] 42613 1727204624.96021: sending task result for task 127b8e07-fff9-2f91-05d8-000000000690 42613 1727204624.96113: done sending task result for task 127b8e07-fff9-2f91-05d8-000000000690 42613 1727204624.96374: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 42613 1727204624.96426: no more pending results, returning what we 
have 42613 1727204624.96430: results queue empty 42613 1727204624.96431: checking for any_errors_fatal 42613 1727204624.96438: done checking for any_errors_fatal 42613 1727204624.96439: checking for max_fail_percentage 42613 1727204624.96441: done checking for max_fail_percentage 42613 1727204624.96442: checking to see if all hosts have failed and the running result is not ok 42613 1727204624.96443: done checking to see if all hosts have failed 42613 1727204624.96444: getting the remaining hosts for this loop 42613 1727204624.96446: done getting the remaining hosts for this loop 42613 1727204624.96451: getting the next task for host managed-node3 42613 1727204624.96460: done getting next task for host managed-node3 42613 1727204624.96464: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 42613 1727204624.96470: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204624.96476: getting variables 42613 1727204624.96477: in VariableManager get_vars() 42613 1727204624.96512: Calling all_inventory to load vars for managed-node3 42613 1727204624.96516: Calling groups_inventory to load vars for managed-node3 42613 1727204624.96519: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204624.96532: Calling all_plugins_play to load vars for managed-node3 42613 1727204624.96535: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204624.96539: Calling groups_plugins_play to load vars for managed-node3 42613 1727204624.98715: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204625.01074: done with get_vars() 42613 1727204625.01116: done getting variables 42613 1727204625.01194: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 42613 1727204625.01321: variable 'profile' from source: include params 42613 1727204625.01325: variable 'interface' from source: set_fact 42613 1727204625.01396: variable 'interface' from source: set_fact TASK [Verify the ansible_managed comment in ifcfg-ethtest0] ******************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 15:03:45 -0400 (0:00:00.085) 0:00:53.622 ***** 42613 1727204625.01431: entering _queue_task() for managed-node3/set_fact 42613 1727204625.01857: worker is 1 (out of 1 available) 42613 1727204625.01875: exiting _queue_task() for managed-node3/set_fact 42613 1727204625.01888: done queuing things up, now waiting for results queue to drain 42613 1727204625.01889: waiting for pending results... 
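The skipped tasks above come from get_profile_stat.yml, and the log shows each one guarded by the same condition: `profile_stat.stat.exists` evaluated False, so the command and set_fact actions were never dispatched to the host. A hedged sketch of what the "Get"/"Verify the ansible_managed comment" pair at lines 49 and 56 of that file plausibly looks like follows; only the task names, the command/set_fact actions, and the `profile_stat.stat.exists` guard are confirmed by the log, while the grep pattern, register name, and fact logic are assumptions:

```yaml
# Hypothetical reconstruction of get_profile_stat.yml:49 and :56.
# Confirmed by the log: task names, action plugins (command / set_fact),
# and the profile_stat.stat.exists guard. Assumed: the grep command,
# the register name, and the fact expression.
- name: "Get the ansible_managed comment in ifcfg-{{ profile }}"
  command: grep "^# ansible managed" /etc/sysconfig/network-scripts/ifcfg-{{ profile }}
  register: ansible_managed_comment  # assumed register name
  when: profile_stat.stat.exists

- name: "Verify the ansible_managed comment in ifcfg-{{ profile }}"
  set_fact:
    lsr_net_profile_ansible_managed: "{{ ansible_managed_comment.rc == 0 }}"  # assumed
  when: profile_stat.stat.exists
```

Because the conditional fails before execution, each task returns `skip_reason: "Conditional result was False"` directly from the controller, which is why no connection plugin is invoked for these records.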
42613 1727204625.02177: running TaskExecutor() for managed-node3/TASK: Verify the ansible_managed comment in ifcfg-ethtest0 42613 1727204625.02333: in run() - task 127b8e07-fff9-2f91-05d8-000000000691 42613 1727204625.02363: variable 'ansible_search_path' from source: unknown 42613 1727204625.02374: variable 'ansible_search_path' from source: unknown 42613 1727204625.02422: calling self._execute() 42613 1727204625.02545: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204625.02563: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204625.02583: variable 'omit' from source: magic vars 42613 1727204625.03071: variable 'ansible_distribution_major_version' from source: facts 42613 1727204625.03075: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204625.03196: variable 'profile_stat' from source: set_fact 42613 1727204625.03232: Evaluated conditional (profile_stat.stat.exists): False 42613 1727204625.03242: when evaluation is False, skipping this task 42613 1727204625.03250: _execute() done 42613 1727204625.03258: dumping result to json 42613 1727204625.03328: done dumping result, returning 42613 1727204625.03332: done running TaskExecutor() for managed-node3/TASK: Verify the ansible_managed comment in ifcfg-ethtest0 [127b8e07-fff9-2f91-05d8-000000000691] 42613 1727204625.03337: sending task result for task 127b8e07-fff9-2f91-05d8-000000000691 42613 1727204625.03417: done sending task result for task 127b8e07-fff9-2f91-05d8-000000000691 42613 1727204625.03421: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 42613 1727204625.03479: no more pending results, returning what we have 42613 1727204625.03484: results queue empty 42613 1727204625.03485: checking for any_errors_fatal 42613 1727204625.03496: done checking for any_errors_fatal 42613 1727204625.03497: 
checking for max_fail_percentage 42613 1727204625.03499: done checking for max_fail_percentage 42613 1727204625.03500: checking to see if all hosts have failed and the running result is not ok 42613 1727204625.03501: done checking to see if all hosts have failed 42613 1727204625.03502: getting the remaining hosts for this loop 42613 1727204625.03504: done getting the remaining hosts for this loop 42613 1727204625.03509: getting the next task for host managed-node3 42613 1727204625.03518: done getting next task for host managed-node3 42613 1727204625.03521: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 42613 1727204625.03527: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204625.03532: getting variables 42613 1727204625.03534: in VariableManager get_vars() 42613 1727204625.03576: Calling all_inventory to load vars for managed-node3 42613 1727204625.03579: Calling groups_inventory to load vars for managed-node3 42613 1727204625.03584: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204625.03600: Calling all_plugins_play to load vars for managed-node3 42613 1727204625.03604: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204625.03608: Calling groups_plugins_play to load vars for managed-node3 42613 1727204625.05821: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204625.08151: done with get_vars() 42613 1727204625.08196: done getting variables 42613 1727204625.08274: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 42613 1727204625.08404: variable 'profile' from source: include params 42613 1727204625.08409: variable 'interface' from source: set_fact 42613 1727204625.08481: variable 'interface' from source: set_fact TASK [Get the fingerprint comment in ifcfg-ethtest0] *************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 15:03:45 -0400 (0:00:00.070) 0:00:53.693 ***** 42613 1727204625.08516: entering _queue_task() for managed-node3/command 42613 1727204625.09035: worker is 1 (out of 1 available) 42613 1727204625.09049: exiting _queue_task() for managed-node3/command 42613 1727204625.09062: done queuing things up, now waiting for results queue to drain 42613 1727204625.09064: waiting for pending results... 
42613 1727204625.09501: running TaskExecutor() for managed-node3/TASK: Get the fingerprint comment in ifcfg-ethtest0 42613 1727204625.09507: in run() - task 127b8e07-fff9-2f91-05d8-000000000692 42613 1727204625.09511: variable 'ansible_search_path' from source: unknown 42613 1727204625.09514: variable 'ansible_search_path' from source: unknown 42613 1727204625.09518: calling self._execute() 42613 1727204625.09644: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204625.09658: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204625.09675: variable 'omit' from source: magic vars 42613 1727204625.10095: variable 'ansible_distribution_major_version' from source: facts 42613 1727204625.10114: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204625.10261: variable 'profile_stat' from source: set_fact 42613 1727204625.10285: Evaluated conditional (profile_stat.stat.exists): False 42613 1727204625.10293: when evaluation is False, skipping this task 42613 1727204625.10301: _execute() done 42613 1727204625.10309: dumping result to json 42613 1727204625.10359: done dumping result, returning 42613 1727204625.10363: done running TaskExecutor() for managed-node3/TASK: Get the fingerprint comment in ifcfg-ethtest0 [127b8e07-fff9-2f91-05d8-000000000692] 42613 1727204625.10367: sending task result for task 127b8e07-fff9-2f91-05d8-000000000692 skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 42613 1727204625.10519: no more pending results, returning what we have 42613 1727204625.10525: results queue empty 42613 1727204625.10527: checking for any_errors_fatal 42613 1727204625.10536: done checking for any_errors_fatal 42613 1727204625.10537: checking for max_fail_percentage 42613 1727204625.10540: done checking for max_fail_percentage 42613 1727204625.10541: checking to see if all hosts have 
failed and the running result is not ok 42613 1727204625.10542: done checking to see if all hosts have failed 42613 1727204625.10543: getting the remaining hosts for this loop 42613 1727204625.10545: done getting the remaining hosts for this loop 42613 1727204625.10550: getting the next task for host managed-node3 42613 1727204625.10562: done getting next task for host managed-node3 42613 1727204625.10564: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 42613 1727204625.10572: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204625.10577: getting variables 42613 1727204625.10579: in VariableManager get_vars() 42613 1727204625.10615: Calling all_inventory to load vars for managed-node3 42613 1727204625.10619: Calling groups_inventory to load vars for managed-node3 42613 1727204625.10624: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204625.10640: Calling all_plugins_play to load vars for managed-node3 42613 1727204625.10644: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204625.10648: Calling groups_plugins_play to load vars for managed-node3 42613 1727204625.11294: done sending task result for task 127b8e07-fff9-2f91-05d8-000000000692 42613 1727204625.11298: WORKER PROCESS EXITING 42613 1727204625.13083: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204625.15421: done with get_vars() 42613 1727204625.15472: done getting variables 42613 1727204625.15542: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 42613 1727204625.15677: variable 'profile' from source: include params 42613 1727204625.15681: variable 'interface' from source: set_fact 42613 1727204625.15751: variable 'interface' from source: set_fact TASK [Verify the fingerprint comment in ifcfg-ethtest0] ************************ task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 15:03:45 -0400 (0:00:00.072) 0:00:53.765 ***** 42613 1727204625.15786: entering _queue_task() for managed-node3/set_fact 42613 1727204625.16209: worker is 1 (out of 1 available) 42613 1727204625.16224: exiting _queue_task() for managed-node3/set_fact 42613 
1727204625.16480: done queuing things up, now waiting for results queue to drain 42613 1727204625.16482: waiting for pending results... 42613 1727204625.16547: running TaskExecutor() for managed-node3/TASK: Verify the fingerprint comment in ifcfg-ethtest0 42613 1727204625.16803: in run() - task 127b8e07-fff9-2f91-05d8-000000000693 42613 1727204625.16807: variable 'ansible_search_path' from source: unknown 42613 1727204625.16815: variable 'ansible_search_path' from source: unknown 42613 1727204625.16819: calling self._execute() 42613 1727204625.16908: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204625.16926: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204625.16941: variable 'omit' from source: magic vars 42613 1727204625.17377: variable 'ansible_distribution_major_version' from source: facts 42613 1727204625.17395: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204625.17547: variable 'profile_stat' from source: set_fact 42613 1727204625.17580: Evaluated conditional (profile_stat.stat.exists): False 42613 1727204625.17589: when evaluation is False, skipping this task 42613 1727204625.17596: _execute() done 42613 1727204625.17674: dumping result to json 42613 1727204625.17680: done dumping result, returning 42613 1727204625.17683: done running TaskExecutor() for managed-node3/TASK: Verify the fingerprint comment in ifcfg-ethtest0 [127b8e07-fff9-2f91-05d8-000000000693] 42613 1727204625.17685: sending task result for task 127b8e07-fff9-2f91-05d8-000000000693 42613 1727204625.17771: done sending task result for task 127b8e07-fff9-2f91-05d8-000000000693 42613 1727204625.17774: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 42613 1727204625.17832: no more pending results, returning what we have 42613 1727204625.17838: results queue empty 42613 
1727204625.17839: checking for any_errors_fatal 42613 1727204625.17849: done checking for any_errors_fatal 42613 1727204625.17850: checking for max_fail_percentage 42613 1727204625.17853: done checking for max_fail_percentage 42613 1727204625.17854: checking to see if all hosts have failed and the running result is not ok 42613 1727204625.17855: done checking to see if all hosts have failed 42613 1727204625.17856: getting the remaining hosts for this loop 42613 1727204625.17858: done getting the remaining hosts for this loop 42613 1727204625.17864: getting the next task for host managed-node3 42613 1727204625.17877: done getting next task for host managed-node3 42613 1727204625.17882: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}' 42613 1727204625.17886: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204625.17893: getting variables 42613 1727204625.17895: in VariableManager get_vars() 42613 1727204625.17934: Calling all_inventory to load vars for managed-node3 42613 1727204625.17937: Calling groups_inventory to load vars for managed-node3 42613 1727204625.17942: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204625.17959: Calling all_plugins_play to load vars for managed-node3 42613 1727204625.17963: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204625.18086: Calling groups_plugins_play to load vars for managed-node3 42613 1727204625.20420: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204625.22739: done with get_vars() 42613 1727204625.22789: done getting variables 42613 1727204625.22859: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 42613 1727204625.22992: variable 'profile' from source: include params 42613 1727204625.22997: variable 'interface' from source: set_fact 42613 1727204625.23062: variable 'interface' from source: set_fact TASK [Assert that the profile is absent - 'ethtest0'] ************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5 Tuesday 24 September 2024 15:03:45 -0400 (0:00:00.073) 0:00:53.839 ***** 42613 1727204625.23104: entering _queue_task() for managed-node3/assert 42613 1727204625.23629: worker is 1 (out of 1 available) 42613 1727204625.23643: exiting _queue_task() for managed-node3/assert 42613 1727204625.23656: done queuing things up, now waiting for results queue to drain 42613 1727204625.23658: waiting for pending results... 
42613 1727204625.24084: running TaskExecutor() for managed-node3/TASK: Assert that the profile is absent - 'ethtest0' 42613 1727204625.24090: in run() - task 127b8e07-fff9-2f91-05d8-00000000067c 42613 1727204625.24094: variable 'ansible_search_path' from source: unknown 42613 1727204625.24097: variable 'ansible_search_path' from source: unknown 42613 1727204625.24100: calling self._execute() 42613 1727204625.24210: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204625.24223: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204625.24239: variable 'omit' from source: magic vars 42613 1727204625.24682: variable 'ansible_distribution_major_version' from source: facts 42613 1727204625.24701: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204625.24720: variable 'omit' from source: magic vars 42613 1727204625.24831: variable 'omit' from source: magic vars 42613 1727204625.24902: variable 'profile' from source: include params 42613 1727204625.24912: variable 'interface' from source: set_fact 42613 1727204625.24993: variable 'interface' from source: set_fact 42613 1727204625.25020: variable 'omit' from source: magic vars 42613 1727204625.25078: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 42613 1727204625.25118: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 42613 1727204625.25143: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 42613 1727204625.25174: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204625.25190: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204625.25267: variable 'inventory_hostname' from source: host vars for 
'managed-node3' 42613 1727204625.25274: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204625.25276: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204625.25387: Set connection var ansible_shell_executable to /bin/sh 42613 1727204625.25398: Set connection var ansible_pipelining to False 42613 1727204625.25412: Set connection var ansible_module_compression to ZIP_DEFLATED 42613 1727204625.25419: Set connection var ansible_connection to ssh 42613 1727204625.25429: Set connection var ansible_timeout to 10 42613 1727204625.25486: Set connection var ansible_shell_type to sh 42613 1727204625.25489: variable 'ansible_shell_executable' from source: unknown 42613 1727204625.25491: variable 'ansible_connection' from source: unknown 42613 1727204625.25494: variable 'ansible_module_compression' from source: unknown 42613 1727204625.25496: variable 'ansible_shell_type' from source: unknown 42613 1727204625.25498: variable 'ansible_shell_executable' from source: unknown 42613 1727204625.25500: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204625.25508: variable 'ansible_pipelining' from source: unknown 42613 1727204625.25516: variable 'ansible_timeout' from source: unknown 42613 1727204625.25524: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204625.25705: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 42613 1727204625.25725: variable 'omit' from source: magic vars 42613 1727204625.25735: starting attempt loop 42613 1727204625.25772: running the handler 42613 1727204625.25912: variable 'lsr_net_profile_exists' from source: set_fact 42613 1727204625.25933: Evaluated conditional (not 
lsr_net_profile_exists): True 42613 1727204625.25946: handler run complete 42613 1727204625.26033: attempt loop complete, returning result 42613 1727204625.26037: _execute() done 42613 1727204625.26039: dumping result to json 42613 1727204625.26042: done dumping result, returning 42613 1727204625.26045: done running TaskExecutor() for managed-node3/TASK: Assert that the profile is absent - 'ethtest0' [127b8e07-fff9-2f91-05d8-00000000067c] 42613 1727204625.26047: sending task result for task 127b8e07-fff9-2f91-05d8-00000000067c ok: [managed-node3] => { "changed": false } MSG: All assertions passed 42613 1727204625.26196: no more pending results, returning what we have 42613 1727204625.26201: results queue empty 42613 1727204625.26203: checking for any_errors_fatal 42613 1727204625.26210: done checking for any_errors_fatal 42613 1727204625.26211: checking for max_fail_percentage 42613 1727204625.26213: done checking for max_fail_percentage 42613 1727204625.26215: checking to see if all hosts have failed and the running result is not ok 42613 1727204625.26216: done checking to see if all hosts have failed 42613 1727204625.26217: getting the remaining hosts for this loop 42613 1727204625.26219: done getting the remaining hosts for this loop 42613 1727204625.26224: getting the next task for host managed-node3 42613 1727204625.26235: done getting next task for host managed-node3 42613 1727204625.26239: ^ task is: TASK: Include the task 'assert_device_absent.yml' 42613 1727204625.26354: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204625.26360: getting variables 42613 1727204625.26362: in VariableManager get_vars() 42613 1727204625.26400: Calling all_inventory to load vars for managed-node3 42613 1727204625.26404: Calling groups_inventory to load vars for managed-node3 42613 1727204625.26409: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204625.26424: Calling all_plugins_play to load vars for managed-node3 42613 1727204625.26428: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204625.26432: Calling groups_plugins_play to load vars for managed-node3 42613 1727204625.26986: done sending task result for task 127b8e07-fff9-2f91-05d8-00000000067c 42613 1727204625.26992: WORKER PROCESS EXITING 42613 1727204625.28640: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204625.30915: done with get_vars() 42613 1727204625.30962: done getting variables TASK [Include the task 'assert_device_absent.yml'] ***************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:234 Tuesday 24 September 2024 15:03:45 -0400 (0:00:00.079) 0:00:53.918 ***** 42613 1727204625.31075: entering _queue_task() for managed-node3/include_tasks 42613 1727204625.31491: worker is 1 (out of 1 available) 42613 1727204625.31507: exiting _queue_task() for managed-node3/include_tasks 42613 1727204625.31521: done queuing things up, now waiting for results queue to drain 42613 1727204625.31522: waiting for pending results... 
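The assertion logged above passed because `not lsr_net_profile_exists` evaluated True for ethtest0. A minimal sketch of what assert_profile_absent.yml:5 plausibly contains; the task name, the assert action, and the `not lsr_net_profile_exists` condition are confirmed by the log, while the failure message is an assumption:

```yaml
# Hypothetical reconstruction of assert_profile_absent.yml:5.
# Confirmed by the log: task name, assert action, and the condition.
# Assumed: the msg text.
- name: "Assert that the profile is absent - '{{ profile }}'"
  assert:
    that:
      - not lsr_net_profile_exists
    msg: "Profile {{ profile }} unexpectedly exists"  # assumed message
```

Note that the assert action runs entirely on the controller (the log shows the ssh connection vars being set but "All assertions passed" returned without a remote module execution), which matches the `ok: [managed-node3]` result with `"changed": false`.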
42613 1727204625.31845: running TaskExecutor() for managed-node3/TASK: Include the task 'assert_device_absent.yml' 42613 1727204625.31977: in run() - task 127b8e07-fff9-2f91-05d8-0000000000aa 42613 1727204625.32005: variable 'ansible_search_path' from source: unknown 42613 1727204625.32053: calling self._execute() 42613 1727204625.32169: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204625.32182: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204625.32196: variable 'omit' from source: magic vars 42613 1727204625.32625: variable 'ansible_distribution_major_version' from source: facts 42613 1727204625.32650: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204625.32662: _execute() done 42613 1727204625.32672: dumping result to json 42613 1727204625.32679: done dumping result, returning 42613 1727204625.32690: done running TaskExecutor() for managed-node3/TASK: Include the task 'assert_device_absent.yml' [127b8e07-fff9-2f91-05d8-0000000000aa] 42613 1727204625.32759: sending task result for task 127b8e07-fff9-2f91-05d8-0000000000aa 42613 1727204625.32854: done sending task result for task 127b8e07-fff9-2f91-05d8-0000000000aa 42613 1727204625.32971: WORKER PROCESS EXITING 42613 1727204625.33008: no more pending results, returning what we have 42613 1727204625.33015: in VariableManager get_vars() 42613 1727204625.33059: Calling all_inventory to load vars for managed-node3 42613 1727204625.33063: Calling groups_inventory to load vars for managed-node3 42613 1727204625.33070: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204625.33094: Calling all_plugins_play to load vars for managed-node3 42613 1727204625.33098: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204625.33102: Calling groups_plugins_play to load vars for managed-node3 42613 1727204625.35452: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204625.38018: done with get_vars() 42613 1727204625.38062: variable 'ansible_search_path' from source: unknown 42613 1727204625.38083: we have included files to process 42613 1727204625.38084: generating all_blocks data 42613 1727204625.38086: done generating all_blocks data 42613 1727204625.38093: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 42613 1727204625.38094: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 42613 1727204625.38097: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 42613 1727204625.38499: in VariableManager get_vars() 42613 1727204625.38519: done with get_vars() 42613 1727204625.38775: done processing included file 42613 1727204625.38777: iterating over new_blocks loaded from include file 42613 1727204625.38779: in VariableManager get_vars() 42613 1727204625.38793: done with get_vars() 42613 1727204625.38795: filtering new block on tags 42613 1727204625.38887: done filtering new block on tags 42613 1727204625.38890: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed-node3 42613 1727204625.38896: extending task lists for all hosts with included blocks 42613 1727204625.39263: done extending task lists 42613 1727204625.39265: done processing included files 42613 1727204625.39268: results queue empty 42613 1727204625.39269: checking for any_errors_fatal 42613 1727204625.39273: done checking for any_errors_fatal 42613 1727204625.39274: checking for max_fail_percentage 42613 1727204625.39275: done 
checking for max_fail_percentage 42613 1727204625.39369: checking to see if all hosts have failed and the running result is not ok 42613 1727204625.39371: done checking to see if all hosts have failed 42613 1727204625.39372: getting the remaining hosts for this loop 42613 1727204625.39373: done getting the remaining hosts for this loop 42613 1727204625.39377: getting the next task for host managed-node3 42613 1727204625.39382: done getting next task for host managed-node3 42613 1727204625.39384: ^ task is: TASK: Include the task 'get_interface_stat.yml' 42613 1727204625.39391: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204625.39394: getting variables 42613 1727204625.39396: in VariableManager get_vars() 42613 1727204625.39411: Calling all_inventory to load vars for managed-node3 42613 1727204625.39413: Calling groups_inventory to load vars for managed-node3 42613 1727204625.39416: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204625.39422: Calling all_plugins_play to load vars for managed-node3 42613 1727204625.39425: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204625.39428: Calling groups_plugins_play to load vars for managed-node3 42613 1727204625.42330: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204625.44175: done with get_vars() 42613 1727204625.44217: done getting variables

TASK [Include the task 'get_interface_stat.yml'] *******************************
task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3
Tuesday 24 September 2024 15:03:45 -0400 (0:00:00.132) 0:00:54.051 *****
42613 1727204625.44310: entering _queue_task() for managed-node3/include_tasks 42613 1727204625.44641: worker is 1 (out of 1 available) 42613 1727204625.44656: exiting _queue_task() for managed-node3/include_tasks 42613 1727204625.44671: done queuing things up, now waiting for results queue to drain 42613 1727204625.44673: waiting for pending results... 
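For context, the log above shows assert_device_absent.yml (at line 3, per the task path) chaining into get_interface_stat.yml via an include. A sketch of that include task, reconstructed from the log output rather than copied from the playbook source — the `when:` condition is inferred from the "Evaluated conditional (ansible_distribution_major_version != '6')" entries, which suggest a conditional inherited from an enclosing block:

```yaml
# Reconstructed sketch, not the actual playbook source.
- name: Include the task 'get_interface_stat.yml'
  include_tasks: tasks/get_interface_stat.yml
  # The log evaluates this conditional before every task in the chain,
  # so it likely comes from a surrounding block rather than each task:
  # when: ansible_distribution_major_version != '6'
```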
42613 1727204625.44895: running TaskExecutor() for managed-node3/TASK: Include the task 'get_interface_stat.yml' 42613 1727204625.44985: in run() - task 127b8e07-fff9-2f91-05d8-0000000006c4 42613 1727204625.45002: variable 'ansible_search_path' from source: unknown 42613 1727204625.45005: variable 'ansible_search_path' from source: unknown 42613 1727204625.45043: calling self._execute() 42613 1727204625.45129: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204625.45136: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204625.45146: variable 'omit' from source: magic vars 42613 1727204625.45452: variable 'ansible_distribution_major_version' from source: facts 42613 1727204625.45463: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204625.45470: _execute() done 42613 1727204625.45473: dumping result to json 42613 1727204625.45478: done dumping result, returning 42613 1727204625.45484: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_interface_stat.yml' [127b8e07-fff9-2f91-05d8-0000000006c4] 42613 1727204625.45490: sending task result for task 127b8e07-fff9-2f91-05d8-0000000006c4 42613 1727204625.45592: done sending task result for task 127b8e07-fff9-2f91-05d8-0000000006c4 42613 1727204625.45596: WORKER PROCESS EXITING 42613 1727204625.45630: no more pending results, returning what we have 42613 1727204625.45635: in VariableManager get_vars() 42613 1727204625.45679: Calling all_inventory to load vars for managed-node3 42613 1727204625.45683: Calling groups_inventory to load vars for managed-node3 42613 1727204625.45686: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204625.45702: Calling all_plugins_play to load vars for managed-node3 42613 1727204625.45712: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204625.45716: Calling groups_plugins_play to load vars for managed-node3 42613 
1727204625.47008: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204625.49308: done with get_vars() 42613 1727204625.49347: variable 'ansible_search_path' from source: unknown 42613 1727204625.49348: variable 'ansible_search_path' from source: unknown 42613 1727204625.49395: we have included files to process 42613 1727204625.49396: generating all_blocks data 42613 1727204625.49398: done generating all_blocks data 42613 1727204625.49399: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 42613 1727204625.49401: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 42613 1727204625.49403: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 42613 1727204625.49615: done processing included file 42613 1727204625.49617: iterating over new_blocks loaded from include file 42613 1727204625.49619: in VariableManager get_vars() 42613 1727204625.49635: done with get_vars() 42613 1727204625.49637: filtering new block on tags 42613 1727204625.49657: done filtering new block on tags 42613 1727204625.49659: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node3 42613 1727204625.49667: extending task lists for all hosts with included blocks 42613 1727204625.49780: done extending task lists 42613 1727204625.49781: done processing included files 42613 1727204625.49782: results queue empty 42613 1727204625.49783: checking for any_errors_fatal 42613 1727204625.49787: done checking for any_errors_fatal 42613 1727204625.49788: checking for max_fail_percentage 42613 1727204625.49790: done checking for 
max_fail_percentage 42613 1727204625.49791: checking to see if all hosts have failed and the running result is not ok 42613 1727204625.49791: done checking to see if all hosts have failed 42613 1727204625.49792: getting the remaining hosts for this loop 42613 1727204625.49793: done getting the remaining hosts for this loop 42613 1727204625.49796: getting the next task for host managed-node3 42613 1727204625.49801: done getting next task for host managed-node3 42613 1727204625.49803: ^ task is: TASK: Get stat for interface {{ interface }} 42613 1727204625.49806: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204625.49809: getting variables 42613 1727204625.49810: in VariableManager get_vars() 42613 1727204625.49819: Calling all_inventory to load vars for managed-node3 42613 1727204625.49822: Calling groups_inventory to load vars for managed-node3 42613 1727204625.49824: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204625.49830: Calling all_plugins_play to load vars for managed-node3 42613 1727204625.49833: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204625.49836: Calling groups_plugins_play to load vars for managed-node3 42613 1727204625.51607: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204625.53928: done with get_vars() 42613 1727204625.53983: done getting variables 42613 1727204625.54177: variable 'interface' from source: set_fact

TASK [Get stat for interface ethtest0] *****************************************
task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3
Tuesday 24 September 2024 15:03:45 -0400 (0:00:00.098) 0:00:54.150 *****
42613 1727204625.54211: entering _queue_task() for managed-node3/stat 42613 1727204625.54623: worker is 1 (out of 1 available) 42613 1727204625.54637: exiting _queue_task() for managed-node3/stat 42613 1727204625.54653: done queuing things up, now waiting for results queue to drain 42613 1727204625.54655: waiting for pending results... 
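The stat task queued above can be sketched as follows. The module arguments (`path`, `get_attributes`, `get_checksum`, `get_mime`) are taken from the module invocation recorded later in this log, with `ethtest0` substituted back to the `{{ interface }}` template named in the task title; the register name `interface_stat` is a guess for illustration:

```yaml
# Sketch of get_interface_stat.yml's stat task; module args match the
# recorded invocation, but the register name is hypothetical.
- name: Get stat for interface {{ interface }}
  stat:
    path: /sys/class/net/{{ interface }}
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: interface_stat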
42613 1727204625.55088: running TaskExecutor() for managed-node3/TASK: Get stat for interface ethtest0 42613 1727204625.55115: in run() - task 127b8e07-fff9-2f91-05d8-0000000006de 42613 1727204625.55184: variable 'ansible_search_path' from source: unknown 42613 1727204625.55187: variable 'ansible_search_path' from source: unknown 42613 1727204625.55199: calling self._execute() 42613 1727204625.55319: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204625.55332: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204625.55350: variable 'omit' from source: magic vars 42613 1727204625.55793: variable 'ansible_distribution_major_version' from source: facts 42613 1727204625.55811: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204625.55832: variable 'omit' from source: magic vars 42613 1727204625.55944: variable 'omit' from source: magic vars 42613 1727204625.56017: variable 'interface' from source: set_fact 42613 1727204625.56050: variable 'omit' from source: magic vars 42613 1727204625.56106: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 42613 1727204625.56159: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 42613 1727204625.56192: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 42613 1727204625.56216: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204625.56267: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204625.56277: variable 'inventory_hostname' from source: host vars for 'managed-node3' 42613 1727204625.56285: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204625.56293: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204625.56412: Set connection var ansible_shell_executable to /bin/sh 42613 1727204625.56426: Set connection var ansible_pipelining to False 42613 1727204625.56444: Set connection var ansible_module_compression to ZIP_DEFLATED 42613 1727204625.56482: Set connection var ansible_connection to ssh 42613 1727204625.56485: Set connection var ansible_timeout to 10 42613 1727204625.56488: Set connection var ansible_shell_type to sh 42613 1727204625.56499: variable 'ansible_shell_executable' from source: unknown 42613 1727204625.56507: variable 'ansible_connection' from source: unknown 42613 1727204625.56513: variable 'ansible_module_compression' from source: unknown 42613 1727204625.56590: variable 'ansible_shell_type' from source: unknown 42613 1727204625.56594: variable 'ansible_shell_executable' from source: unknown 42613 1727204625.56596: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204625.56599: variable 'ansible_pipelining' from source: unknown 42613 1727204625.56601: variable 'ansible_timeout' from source: unknown 42613 1727204625.56603: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204625.56804: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 42613 1727204625.56828: variable 'omit' from source: magic vars 42613 1727204625.56843: starting attempt loop 42613 1727204625.56852: running the handler 42613 1727204625.56875: _low_level_execute_command(): starting 42613 1727204625.56889: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 42613 1727204625.57706: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204625.57778: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204625.57872: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204625.57876: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204625.57878: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204625.57978: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204625.59843: stdout chunk (state=3): >>>/root <<< 42613 1727204625.60082: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204625.60086: stdout chunk (state=3): >>><<< 42613 1727204625.60089: stderr chunk (state=3): >>><<< 42613 1727204625.60092: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204625.60095: _low_level_execute_command(): starting 42613 1727204625.60098: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204625.600539-46286-31314416011694 `" && echo ansible-tmp-1727204625.600539-46286-31314416011694="` echo /root/.ansible/tmp/ansible-tmp-1727204625.600539-46286-31314416011694 `" ) && sleep 0' 42613 1727204625.60760: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204625.60768: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204625.60782: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204625.60801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 42613 1727204625.60814: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 42613 1727204625.60837: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204625.60886: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204625.60944: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204625.60955: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204625.60977: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204625.61095: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204625.63272: stdout chunk (state=3): >>>ansible-tmp-1727204625.600539-46286-31314416011694=/root/.ansible/tmp/ansible-tmp-1727204625.600539-46286-31314416011694 <<< 42613 1727204625.63386: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204625.63447: stderr chunk (state=3): >>><<< 42613 1727204625.63451: stdout chunk (state=3): >>><<< 42613 1727204625.63473: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204625.600539-46286-31314416011694=/root/.ansible/tmp/ansible-tmp-1727204625.600539-46286-31314416011694 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204625.63551: variable 'ansible_module_compression' from source: unknown 42613 1727204625.63637: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-42613igbktgy6/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 42613 1727204625.63667: variable 'ansible_facts' from source: unknown 42613 1727204625.63735: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204625.600539-46286-31314416011694/AnsiballZ_stat.py 42613 1727204625.64172: Sending initial data 42613 1727204625.64175: Sent initial data (151 bytes) 42613 1727204625.64533: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204625.64542: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204625.64556: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204625.64654: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204625.64674: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204625.64777: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204625.66564: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 42613 1727204625.66621: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 42613 1727204625.66702: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-42613igbktgy6/tmpax731c8s /root/.ansible/tmp/ansible-tmp-1727204625.600539-46286-31314416011694/AnsiballZ_stat.py <<< 42613 1727204625.66706: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204625.600539-46286-31314416011694/AnsiballZ_stat.py" <<< 42613 1727204625.66768: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-42613igbktgy6/tmpax731c8s" to remote "/root/.ansible/tmp/ansible-tmp-1727204625.600539-46286-31314416011694/AnsiballZ_stat.py" <<< 42613 1727204625.66771: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204625.600539-46286-31314416011694/AnsiballZ_stat.py" <<< 42613 1727204625.67537: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204625.67577: stderr chunk (state=3): >>><<< 42613 1727204625.67710: stdout chunk (state=3): >>><<< 42613 1727204625.67714: done transferring module to remote 42613 1727204625.67716: _low_level_execute_command(): starting 42613 1727204625.67719: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204625.600539-46286-31314416011694/ /root/.ansible/tmp/ansible-tmp-1727204625.600539-46286-31314416011694/AnsiballZ_stat.py && sleep 0' 42613 1727204625.68427: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204625.68474: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204625.68478: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204625.68501: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204625.68592: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204625.70622: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204625.70684: stderr chunk (state=3): >>><<< 42613 1727204625.70688: stdout chunk (state=3): >>><<< 42613 1727204625.70704: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204625.70708: _low_level_execute_command(): starting 42613 1727204625.70712: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204625.600539-46286-31314416011694/AnsiballZ_stat.py && sleep 0' 42613 1727204625.71224: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204625.71228: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address <<< 42613 1727204625.71231: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204625.71290: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204625.71294: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204625.71300: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 
1727204625.71377: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204625.88885: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 42613 1727204625.90501: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. <<< 42613 1727204625.90562: stderr chunk (state=3): >>><<< 42613 1727204625.90569: stdout chunk (state=3): >>><<< 42613 1727204625.90587: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 42613 1727204625.90611: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204625.600539-46286-31314416011694/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 42613 1727204625.90622: _low_level_execute_command(): starting 42613 1727204625.90627: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204625.600539-46286-31314416011694/ > /dev/null 2>&1 && sleep 0' 42613 1727204625.91328: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204625.91390: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204625.91595: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204625.93659: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204625.93664: stdout chunk (state=3): >>><<< 42613 1727204625.93668: stderr chunk (state=3): >>><<< 42613 1727204625.93690: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 42613 1727204625.93872: handler run complete 42613 1727204625.93876: attempt loop complete, returning result 42613 1727204625.93878: _execute() done 42613 1727204625.93881: dumping result to json 42613 1727204625.93883: done dumping result, returning 42613 1727204625.93885: done running TaskExecutor() for managed-node3/TASK: Get stat for interface ethtest0 [127b8e07-fff9-2f91-05d8-0000000006de] 42613 1727204625.93887: sending task result for task 127b8e07-fff9-2f91-05d8-0000000006de 42613 1727204625.93969: done sending task result for task 127b8e07-fff9-2f91-05d8-0000000006de 42613 1727204625.93973: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "stat": { "exists": false } } 42613 1727204625.94038: no more pending results, returning what we have 42613 1727204625.94045: results queue empty 42613 1727204625.94046: checking for any_errors_fatal 42613 1727204625.94048: done checking for any_errors_fatal 42613 1727204625.94049: checking for max_fail_percentage 42613 1727204625.94051: done checking for max_fail_percentage 42613 1727204625.94052: checking to see if all hosts have failed and the running result is not ok 42613 1727204625.94052: done checking to see if all hosts have failed 42613 1727204625.94053: getting the remaining hosts for this loop 42613 1727204625.94055: done getting the remaining hosts for this loop 42613 1727204625.94060: getting the next task for host managed-node3 42613 1727204625.94073: done getting next task for host managed-node3 42613 1727204625.94078: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 42613 1727204625.94081: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 42613 1727204625.94086: getting variables 42613 1727204625.94087: in VariableManager get_vars() 42613 1727204625.94121: Calling all_inventory to load vars for managed-node3 42613 1727204625.94123: Calling groups_inventory to load vars for managed-node3 42613 1727204625.94127: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204625.94138: Calling all_plugins_play to load vars for managed-node3 42613 1727204625.94144: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204625.94146: Calling groups_plugins_play to load vars for managed-node3 42613 1727204625.96318: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204625.99085: done with get_vars() 42613 1727204625.99130: done getting variables 42613 1727204625.99204: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 42613 1727204625.99339: variable 'interface' from source: set_fact TASK [Assert that the interface is absent - 'ethtest0'] ************************ task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Tuesday 24 September 2024 15:03:45 -0400 (0:00:00.451) 0:00:54.601 ***** 42613 1727204625.99374: entering _queue_task() for managed-node3/assert 42613 1727204625.99773: worker is 1 (out 
of 1 available) 42613 1727204625.99788: exiting _queue_task() for managed-node3/assert 42613 1727204625.99802: done queuing things up, now waiting for results queue to drain 42613 1727204625.99804: waiting for pending results... 42613 1727204626.00189: running TaskExecutor() for managed-node3/TASK: Assert that the interface is absent - 'ethtest0' 42613 1727204626.00263: in run() - task 127b8e07-fff9-2f91-05d8-0000000006c5 42613 1727204626.00294: variable 'ansible_search_path' from source: unknown 42613 1727204626.00303: variable 'ansible_search_path' from source: unknown 42613 1727204626.00350: calling self._execute() 42613 1727204626.00468: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204626.00484: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204626.00503: variable 'omit' from source: magic vars 42613 1727204626.00973: variable 'ansible_distribution_major_version' from source: facts 42613 1727204626.00996: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204626.01008: variable 'omit' from source: magic vars 42613 1727204626.01061: variable 'omit' from source: magic vars 42613 1727204626.01185: variable 'interface' from source: set_fact 42613 1727204626.01213: variable 'omit' from source: magic vars 42613 1727204626.01278: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 42613 1727204626.01326: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 42613 1727204626.01356: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 42613 1727204626.01384: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204626.01399: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 42613 1727204626.01433: variable 'inventory_hostname' from source: host vars for 'managed-node3' 42613 1727204626.01441: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204626.01449: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204626.01568: Set connection var ansible_shell_executable to /bin/sh 42613 1727204626.01579: Set connection var ansible_pipelining to False 42613 1727204626.01698: Set connection var ansible_module_compression to ZIP_DEFLATED 42613 1727204626.01701: Set connection var ansible_connection to ssh 42613 1727204626.01704: Set connection var ansible_timeout to 10 42613 1727204626.01706: Set connection var ansible_shell_type to sh 42613 1727204626.01708: variable 'ansible_shell_executable' from source: unknown 42613 1727204626.01710: variable 'ansible_connection' from source: unknown 42613 1727204626.01712: variable 'ansible_module_compression' from source: unknown 42613 1727204626.01714: variable 'ansible_shell_type' from source: unknown 42613 1727204626.01716: variable 'ansible_shell_executable' from source: unknown 42613 1727204626.01718: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204626.01719: variable 'ansible_pipelining' from source: unknown 42613 1727204626.01721: variable 'ansible_timeout' from source: unknown 42613 1727204626.01723: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204626.01838: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 42613 1727204626.01857: variable 'omit' from source: magic vars 42613 1727204626.01869: starting attempt loop 42613 1727204626.01876: running the handler 42613 1727204626.02059: variable 
'interface_stat' from source: set_fact 42613 1727204626.02076: Evaluated conditional (not interface_stat.stat.exists): True 42613 1727204626.02087: handler run complete 42613 1727204626.02108: attempt loop complete, returning result 42613 1727204626.02114: _execute() done 42613 1727204626.02119: dumping result to json 42613 1727204626.02127: done dumping result, returning 42613 1727204626.02140: done running TaskExecutor() for managed-node3/TASK: Assert that the interface is absent - 'ethtest0' [127b8e07-fff9-2f91-05d8-0000000006c5] 42613 1727204626.02150: sending task result for task 127b8e07-fff9-2f91-05d8-0000000006c5 42613 1727204626.02497: done sending task result for task 127b8e07-fff9-2f91-05d8-0000000006c5 42613 1727204626.02500: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false } MSG: All assertions passed 42613 1727204626.02549: no more pending results, returning what we have 42613 1727204626.02553: results queue empty 42613 1727204626.02554: checking for any_errors_fatal 42613 1727204626.02561: done checking for any_errors_fatal 42613 1727204626.02562: checking for max_fail_percentage 42613 1727204626.02564: done checking for max_fail_percentage 42613 1727204626.02569: checking to see if all hosts have failed and the running result is not ok 42613 1727204626.02571: done checking to see if all hosts have failed 42613 1727204626.02572: getting the remaining hosts for this loop 42613 1727204626.02573: done getting the remaining hosts for this loop 42613 1727204626.02578: getting the next task for host managed-node3 42613 1727204626.02585: done getting next task for host managed-node3 42613 1727204626.02588: ^ task is: TASK: Verify network state restored to default 42613 1727204626.02590: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 42613 1727204626.02594: getting variables 42613 1727204626.02596: in VariableManager get_vars() 42613 1727204626.02628: Calling all_inventory to load vars for managed-node3 42613 1727204626.02632: Calling groups_inventory to load vars for managed-node3 42613 1727204626.02636: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204626.02647: Calling all_plugins_play to load vars for managed-node3 42613 1727204626.02651: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204626.02654: Calling groups_plugins_play to load vars for managed-node3 42613 1727204626.04489: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204626.06788: done with get_vars() 42613 1727204626.06821: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:236 Tuesday 24 September 2024 15:03:46 -0400 (0:00:00.075) 0:00:54.677 ***** 42613 1727204626.06933: entering _queue_task() for managed-node3/include_tasks 42613 1727204626.07333: worker is 1 (out of 1 available) 42613 1727204626.07349: exiting _queue_task() for managed-node3/include_tasks 42613 1727204626.07364: done queuing things up, now waiting for results queue to drain 42613 1727204626.07569: waiting for pending results... 
42613 1727204626.07690: running TaskExecutor() for managed-node3/TASK: Verify network state restored to default 42613 1727204626.07816: in run() - task 127b8e07-fff9-2f91-05d8-0000000000ab 42613 1727204626.07841: variable 'ansible_search_path' from source: unknown 42613 1727204626.07891: calling self._execute() 42613 1727204626.08012: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204626.08031: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204626.08050: variable 'omit' from source: magic vars 42613 1727204626.08576: variable 'ansible_distribution_major_version' from source: facts 42613 1727204626.08581: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204626.08585: _execute() done 42613 1727204626.08588: dumping result to json 42613 1727204626.08590: done dumping result, returning 42613 1727204626.08593: done running TaskExecutor() for managed-node3/TASK: Verify network state restored to default [127b8e07-fff9-2f91-05d8-0000000000ab] 42613 1727204626.08594: sending task result for task 127b8e07-fff9-2f91-05d8-0000000000ab 42613 1727204626.08868: done sending task result for task 127b8e07-fff9-2f91-05d8-0000000000ab 42613 1727204626.08875: WORKER PROCESS EXITING 42613 1727204626.08906: no more pending results, returning what we have 42613 1727204626.08911: in VariableManager get_vars() 42613 1727204626.08954: Calling all_inventory to load vars for managed-node3 42613 1727204626.08957: Calling groups_inventory to load vars for managed-node3 42613 1727204626.08961: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204626.08979: Calling all_plugins_play to load vars for managed-node3 42613 1727204626.08982: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204626.08986: Calling groups_plugins_play to load vars for managed-node3 42613 1727204626.16637: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204626.18828: done with get_vars() 42613 1727204626.18867: variable 'ansible_search_path' from source: unknown 42613 1727204626.18885: we have included files to process 42613 1727204626.18886: generating all_blocks data 42613 1727204626.18887: done generating all_blocks data 42613 1727204626.18891: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 42613 1727204626.18892: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 42613 1727204626.18895: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 42613 1727204626.19301: done processing included file 42613 1727204626.19304: iterating over new_blocks loaded from include file 42613 1727204626.19305: in VariableManager get_vars() 42613 1727204626.19318: done with get_vars() 42613 1727204626.19320: filtering new block on tags 42613 1727204626.19338: done filtering new block on tags 42613 1727204626.19340: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed-node3 42613 1727204626.19345: extending task lists for all hosts with included blocks 42613 1727204626.19629: done extending task lists 42613 1727204626.19630: done processing included files 42613 1727204626.19631: results queue empty 42613 1727204626.19632: checking for any_errors_fatal 42613 1727204626.19635: done checking for any_errors_fatal 42613 1727204626.19636: checking for max_fail_percentage 42613 1727204626.19638: done checking for max_fail_percentage 42613 1727204626.19639: checking to see if all hosts have failed and the running 
result is not ok 42613 1727204626.19640: done checking to see if all hosts have failed 42613 1727204626.19640: getting the remaining hosts for this loop 42613 1727204626.19642: done getting the remaining hosts for this loop 42613 1727204626.19644: getting the next task for host managed-node3 42613 1727204626.19648: done getting next task for host managed-node3 42613 1727204626.19650: ^ task is: TASK: Check routes and DNS 42613 1727204626.19653: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204626.19655: getting variables 42613 1727204626.19656: in VariableManager get_vars() 42613 1727204626.19669: Calling all_inventory to load vars for managed-node3 42613 1727204626.19672: Calling groups_inventory to load vars for managed-node3 42613 1727204626.19675: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204626.19681: Calling all_plugins_play to load vars for managed-node3 42613 1727204626.19684: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204626.19687: Calling groups_plugins_play to load vars for managed-node3 42613 1727204626.21309: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204626.23513: done with get_vars() 42613 1727204626.23556: done getting variables 42613 1727204626.23612: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Tuesday 24 September 2024 15:03:46 -0400 (0:00:00.167) 0:00:54.844 ***** 42613 1727204626.23642: entering _queue_task() for managed-node3/shell 42613 1727204626.23987: worker is 1 (out of 1 available) 42613 1727204626.24003: exiting _queue_task() for managed-node3/shell 42613 1727204626.24017: done queuing things up, now waiting for results queue to drain 42613 1727204626.24019: waiting for pending results... 
42613 1727204626.24251: running TaskExecutor() for managed-node3/TASK: Check routes and DNS 42613 1727204626.24345: in run() - task 127b8e07-fff9-2f91-05d8-0000000006f6 42613 1727204626.24355: variable 'ansible_search_path' from source: unknown 42613 1727204626.24359: variable 'ansible_search_path' from source: unknown 42613 1727204626.24398: calling self._execute() 42613 1727204626.24483: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204626.24488: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204626.24500: variable 'omit' from source: magic vars 42613 1727204626.24820: variable 'ansible_distribution_major_version' from source: facts 42613 1727204626.24833: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204626.24842: variable 'omit' from source: magic vars 42613 1727204626.24874: variable 'omit' from source: magic vars 42613 1727204626.24902: variable 'omit' from source: magic vars 42613 1727204626.24946: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 42613 1727204626.24977: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 42613 1727204626.24995: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 42613 1727204626.25010: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204626.25023: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 42613 1727204626.25050: variable 'inventory_hostname' from source: host vars for 'managed-node3' 42613 1727204626.25053: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204626.25057: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204626.25139: 
Set connection var ansible_shell_executable to /bin/sh 42613 1727204626.25145: Set connection var ansible_pipelining to False 42613 1727204626.25147: Set connection var ansible_module_compression to ZIP_DEFLATED 42613 1727204626.25156: Set connection var ansible_connection to ssh 42613 1727204626.25158: Set connection var ansible_timeout to 10 42613 1727204626.25161: Set connection var ansible_shell_type to sh 42613 1727204626.25183: variable 'ansible_shell_executable' from source: unknown 42613 1727204626.25186: variable 'ansible_connection' from source: unknown 42613 1727204626.25189: variable 'ansible_module_compression' from source: unknown 42613 1727204626.25191: variable 'ansible_shell_type' from source: unknown 42613 1727204626.25193: variable 'ansible_shell_executable' from source: unknown 42613 1727204626.25196: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204626.25200: variable 'ansible_pipelining' from source: unknown 42613 1727204626.25202: variable 'ansible_timeout' from source: unknown 42613 1727204626.25207: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204626.25325: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 42613 1727204626.25336: variable 'omit' from source: magic vars 42613 1727204626.25343: starting attempt loop 42613 1727204626.25347: running the handler 42613 1727204626.25356: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 42613 1727204626.25374: 
_low_level_execute_command(): starting 42613 1727204626.25383: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 42613 1727204626.26101: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204626.26199: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204626.26230: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204626.26233: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204626.26267: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204626.26375: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204626.28210: stdout chunk (state=3): >>>/root <<< 42613 1727204626.28320: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204626.28377: stderr chunk (state=3): >>><<< 42613 1727204626.28381: stdout chunk (state=3): >>><<< 42613 1727204626.28404: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204626.28418: _low_level_execute_command(): starting 42613 1727204626.28425: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204626.284038-46306-121200873196676 `" && echo ansible-tmp-1727204626.284038-46306-121200873196676="` echo /root/.ansible/tmp/ansible-tmp-1727204626.284038-46306-121200873196676 `" ) && sleep 0' 42613 1727204626.29041: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204626.29053: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204626.29075: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204626.29191: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204626.31353: stdout chunk (state=3): >>>ansible-tmp-1727204626.284038-46306-121200873196676=/root/.ansible/tmp/ansible-tmp-1727204626.284038-46306-121200873196676 <<< 42613 1727204626.31457: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204626.31523: stderr chunk (state=3): >>><<< 42613 1727204626.31528: stdout chunk (state=3): >>><<< 42613 1727204626.31541: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204626.284038-46306-121200873196676=/root/.ansible/tmp/ansible-tmp-1727204626.284038-46306-121200873196676 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204626.31576: variable 'ansible_module_compression' from source: unknown 42613 1727204626.31620: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-42613igbktgy6/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 42613 1727204626.31663: variable 'ansible_facts' from source: unknown 42613 1727204626.31715: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204626.284038-46306-121200873196676/AnsiballZ_command.py 42613 1727204626.31941: Sending initial data 42613 1727204626.31944: Sent initial data (155 bytes) 42613 1727204626.32594: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204626.32615: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration <<< 42613 1727204626.32618: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204626.32675: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204626.32679: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204626.32702: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204626.32775: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204626.34572: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 42613 1727204626.34633: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 42613 1727204626.34699: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-42613igbktgy6/tmp08iwib09 /root/.ansible/tmp/ansible-tmp-1727204626.284038-46306-121200873196676/AnsiballZ_command.py <<< 42613 1727204626.34706: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204626.284038-46306-121200873196676/AnsiballZ_command.py" <<< 42613 1727204626.34768: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-42613igbktgy6/tmp08iwib09" to remote "/root/.ansible/tmp/ansible-tmp-1727204626.284038-46306-121200873196676/AnsiballZ_command.py" <<< 42613 1727204626.34775: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204626.284038-46306-121200873196676/AnsiballZ_command.py" <<< 42613 1727204626.35454: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204626.35586: stderr chunk (state=3): >>><<< 42613 1727204626.35589: stdout chunk (state=3): >>><<< 42613 1727204626.35710: done transferring module to remote 42613 1727204626.35715: _low_level_execute_command(): starting 42613 1727204626.35717: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204626.284038-46306-121200873196676/ /root/.ansible/tmp/ansible-tmp-1727204626.284038-46306-121200873196676/AnsiballZ_command.py && sleep 0' 42613 1727204626.36319: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204626.36348: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204626.36383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204626.36484: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204626.36488: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204626.36505: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204626.36647: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204626.38696: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 42613 1727204626.38753: stderr chunk (state=3): >>><<< 42613 1727204626.38757: stdout chunk (state=3): >>><<< 42613 1727204626.38775: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204626.38778: _low_level_execute_command(): starting 42613 1727204626.38785: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204626.284038-46306-121200873196676/AnsiballZ_command.py && sleep 0' 42613 1727204626.39272: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204626.39276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204626.39302: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204626.39305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204626.39373: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' 
<<< 42613 1727204626.39377: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204626.39380: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204626.39463: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204626.58057: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 02:aa:78:a8:9b:13 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.45.169/22 brd 10.31.47.255 scope global dynamic noprefixroute eth0\n valid_lft 2671sec preferred_lft 2671sec\n inet6 fe80::aa:78ff:fea8:9b13/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.44.1 dev eth0 proto dhcp src 10.31.45.169 metric 100 \n10.31.44.0/22 dev eth0 proto kernel scope link src 10.31.45.169 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8).\n# Do not edit.\n#\n# This file might be symlinked as /etc/resolv.conf. If you're looking at\n# /etc/resolv.conf and seeing this text, you have followed the symlink.\n#\n# This is a dynamic resolv.conf file for connecting local clients to the\n# internal DNS stub resolver of systemd-resolved. This file lists all\n# configured search domains.\n#\n# Run \"resolvectl status\" to see details about the uplink DNS servers\n# currently in use.\n#\n# Third party programs should typically not access this file directly, but only\n# through the symlink at /etc/resolv.conf. 
To manage man:resolv.conf(5) in a\n# different way, replace this symlink by a static file or a different symlink.\n#\n# See man:systemd-resolved.service(8) for details about the supported modes of\n# operation for /etc/resolv.conf.\n\nnameserver 127.0.0.53\noptions edns0 trust-ad\nsearch us-east-1.aws.redhat.com", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 15:03:46.569138", "end": "2024-09-24 15:03:46.579004", "delta": "0:00:00.009866", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 42613 1727204626.60076: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
<<< 42613 1727204626.60080: stdout chunk (state=3): >>><<< 42613 1727204626.60082: stderr chunk (state=3): >>><<< 42613 1727204626.60085: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 02:aa:78:a8:9b:13 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.45.169/22 brd 10.31.47.255 scope global dynamic noprefixroute eth0\n valid_lft 2671sec preferred_lft 2671sec\n inet6 fe80::aa:78ff:fea8:9b13/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.44.1 dev eth0 proto dhcp src 10.31.45.169 metric 100 \n10.31.44.0/22 dev eth0 proto kernel scope link src 10.31.45.169 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8).\n# Do not edit.\n#\n# This file might be symlinked as /etc/resolv.conf. If you're looking at\n# /etc/resolv.conf and seeing this text, you have followed the symlink.\n#\n# This is a dynamic resolv.conf file for connecting local clients to the\n# internal DNS stub resolver of systemd-resolved. This file lists all\n# configured search domains.\n#\n# Run \"resolvectl status\" to see details about the uplink DNS servers\n# currently in use.\n#\n# Third party programs should typically not access this file directly, but only\n# through the symlink at /etc/resolv.conf. 
To manage man:resolv.conf(5) in a\n# different way, replace this symlink by a static file or a different symlink.\n#\n# See man:systemd-resolved.service(8) for details about the supported modes of\n# operation for /etc/resolv.conf.\n\nnameserver 127.0.0.53\noptions edns0 trust-ad\nsearch us-east-1.aws.redhat.com", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 15:03:46.569138", "end": "2024-09-24 15:03:46.579004", "delta": "0:00:00.009866", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 42613 1727204626.60096: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204626.284038-46306-121200873196676/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 42613 1727204626.60098: _low_level_execute_command(): starting 42613 1727204626.60100: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204626.284038-46306-121200873196676/ > /dev/null 2>&1 && sleep 0' 42613 1727204626.60696: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 42613 1727204626.60704: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204626.60716: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 
42613 1727204626.60732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 42613 1727204626.60748: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 42613 1727204626.60758: stderr chunk (state=3): >>>debug2: match not found <<< 42613 1727204626.60784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204626.60788: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 42613 1727204626.60791: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.169 is address <<< 42613 1727204626.60837: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 42613 1727204626.60841: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 42613 1727204626.60843: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 42613 1727204626.60846: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 42613 1727204626.60848: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 42613 1727204626.60850: stderr chunk (state=3): >>>debug2: match found <<< 42613 1727204626.60852: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 42613 1727204626.60928: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 42613 1727204626.60943: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 42613 1727204626.60968: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 42613 1727204626.61072: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 42613 1727204626.63146: stderr chunk (state=3): >>>debug2: Received exit status from 
master 0 <<< 42613 1727204626.63247: stderr chunk (state=3): >>><<< 42613 1727204626.63370: stdout chunk (state=3): >>><<< 42613 1727204626.63374: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 42613 1727204626.63377: handler run complete 42613 1727204626.63379: Evaluated conditional (False): False 42613 1727204626.63382: attempt loop complete, returning result 42613 1727204626.63384: _execute() done 42613 1727204626.63386: dumping result to json 42613 1727204626.63392: done dumping result, returning 42613 1727204626.63395: done running TaskExecutor() for managed-node3/TASK: Check routes and DNS [127b8e07-fff9-2f91-05d8-0000000006f6] 42613 1727204626.63397: sending task result for task 127b8e07-fff9-2f91-05d8-0000000006f6 42613 1727204626.63512: done sending task result for task 
127b8e07-fff9-2f91-05d8-0000000006f6 42613 1727204626.63515: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.009866", "end": "2024-09-24 15:03:46.579004", "rc": 0, "start": "2024-09-24 15:03:46.569138" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 02:aa:78:a8:9b:13 brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.45.169/22 brd 10.31.47.255 scope global dynamic noprefixroute eth0 valid_lft 2671sec preferred_lft 2671sec inet6 fe80::aa:78ff:fea8:9b13/64 scope link noprefixroute valid_lft forever preferred_lft forever IP ROUTE default via 10.31.44.1 dev eth0 proto dhcp src 10.31.45.169 metric 100 10.31.44.0/22 dev eth0 proto kernel scope link src 10.31.45.169 metric 100 IP -6 ROUTE fe80::/64 dev eth0 proto kernel metric 1024 pref medium RESOLV # This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8). # Do not edit. # # This file might be symlinked as /etc/resolv.conf. If you're looking at # /etc/resolv.conf and seeing this text, you have followed the symlink. # # This is a dynamic resolv.conf file for connecting local clients to the # internal DNS stub resolver of systemd-resolved. This file lists all # configured search domains. # # Run "resolvectl status" to see details about the uplink DNS servers # currently in use. # # Third party programs should typically not access this file directly, but only # through the symlink at /etc/resolv.conf. 
To manage man:resolv.conf(5) in a # different way, replace this symlink by a static file or a different symlink. # # See man:systemd-resolved.service(8) for details about the supported modes of # operation for /etc/resolv.conf. nameserver 127.0.0.53 options edns0 trust-ad search us-east-1.aws.redhat.com 42613 1727204626.63659: no more pending results, returning what we have 42613 1727204626.63664: results queue empty 42613 1727204626.63769: checking for any_errors_fatal 42613 1727204626.63777: done checking for any_errors_fatal 42613 1727204626.63778: checking for max_fail_percentage 42613 1727204626.63782: done checking for max_fail_percentage 42613 1727204626.63784: checking to see if all hosts have failed and the running result is not ok 42613 1727204626.63785: done checking to see if all hosts have failed 42613 1727204626.63785: getting the remaining hosts for this loop 42613 1727204626.63787: done getting the remaining hosts for this loop 42613 1727204626.63793: getting the next task for host managed-node3 42613 1727204626.63800: done getting next task for host managed-node3 42613 1727204626.63803: ^ task is: TASK: Verify DNS and network connectivity 42613 1727204626.63806: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 42613 1727204626.63810: getting variables 42613 1727204626.63812: in VariableManager get_vars() 42613 1727204626.63845: Calling all_inventory to load vars for managed-node3 42613 1727204626.63848: Calling groups_inventory to load vars for managed-node3 42613 1727204626.63853: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204626.63986: Calling all_plugins_play to load vars for managed-node3 42613 1727204626.63992: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204626.63997: Calling groups_plugins_play to load vars for managed-node3 42613 1727204626.66163: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204626.68694: done with get_vars() 42613 1727204626.68739: done getting variables 42613 1727204626.68813: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Tuesday 24 September 2024 15:03:46 -0400 (0:00:00.452) 0:00:55.296 ***** 42613 1727204626.68850: entering _queue_task() for managed-node3/shell 42613 1727204626.69263: worker is 1 (out of 1 available) 42613 1727204626.69280: exiting _queue_task() for managed-node3/shell 42613 1727204626.69294: done queuing things up, now waiting for results queue to drain 42613 1727204626.69296: waiting for pending results... 
42613 1727204626.69690: running TaskExecutor() for managed-node3/TASK: Verify DNS and network connectivity 42613 1727204626.69789: in run() - task 127b8e07-fff9-2f91-05d8-0000000006f7 42613 1727204626.69809: variable 'ansible_search_path' from source: unknown 42613 1727204626.69816: variable 'ansible_search_path' from source: unknown 42613 1727204626.69861: calling self._execute() 42613 1727204626.69975: variable 'ansible_host' from source: host vars for 'managed-node3' 42613 1727204626.69988: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 42613 1727204626.70006: variable 'omit' from source: magic vars 42613 1727204626.70448: variable 'ansible_distribution_major_version' from source: facts 42613 1727204626.70467: Evaluated conditional (ansible_distribution_major_version != '6'): True 42613 1727204626.70627: variable 'ansible_facts' from source: unknown 42613 1727204626.71761: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): False 42613 1727204626.71781: when evaluation is False, skipping this task 42613 1727204626.71798: _execute() done 42613 1727204626.71850: dumping result to json 42613 1727204626.71853: done dumping result, returning 42613 1727204626.71856: done running TaskExecutor() for managed-node3/TASK: Verify DNS and network connectivity [127b8e07-fff9-2f91-05d8-0000000006f7] 42613 1727204626.71858: sending task result for task 127b8e07-fff9-2f91-05d8-0000000006f7 42613 1727204626.72178: done sending task result for task 127b8e07-fff9-2f91-05d8-0000000006f7 42613 1727204626.72182: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_facts[\"distribution\"] == \"CentOS\"", "skip_reason": "Conditional result was False" } 42613 1727204626.72237: no more pending results, returning what we have 42613 1727204626.72244: results queue empty 42613 1727204626.72245: checking for any_errors_fatal 42613 1727204626.72259: done checking for any_errors_fatal 42613 
1727204626.72260: checking for max_fail_percentage 42613 1727204626.72262: done checking for max_fail_percentage 42613 1727204626.72263: checking to see if all hosts have failed and the running result is not ok 42613 1727204626.72264: done checking to see if all hosts have failed 42613 1727204626.72267: getting the remaining hosts for this loop 42613 1727204626.72270: done getting the remaining hosts for this loop 42613 1727204626.72275: getting the next task for host managed-node3 42613 1727204626.72286: done getting next task for host managed-node3 42613 1727204626.72289: ^ task is: TASK: meta (flush_handlers) 42613 1727204626.72291: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 42613 1727204626.72296: getting variables 42613 1727204626.72298: in VariableManager get_vars() 42613 1727204626.72333: Calling all_inventory to load vars for managed-node3 42613 1727204626.72337: Calling groups_inventory to load vars for managed-node3 42613 1727204626.72343: Calling all_plugins_inventory to load vars for managed-node3 42613 1727204626.72358: Calling all_plugins_play to load vars for managed-node3 42613 1727204626.72363: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204626.72429: Calling groups_plugins_play to load vars for managed-node3 42613 1727204626.74159: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204626.75425: done with get_vars() 42613 1727204626.75469: done getting variables 42613 1727204626.75546: in VariableManager get_vars() 42613 1727204626.75557: Calling all_inventory to load vars for managed-node3 42613 1727204626.75559: Calling groups_inventory to load vars for managed-node3 42613 1727204626.75562: Calling 
all_plugins_inventory to load vars for managed-node3 42613 1727204626.75569: Calling all_plugins_play to load vars for managed-node3 42613 1727204626.75572: Calling groups_plugins_inventory to load vars for managed-node3 42613 1727204626.75575: Calling groups_plugins_play to load vars for managed-node3 42613 1727204626.77008: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 42613 1727204626.78256: done with get_vars() 42613 1727204626.78293: done queuing things up, now waiting for results queue to drain 42613 1727204626.78295: results queue empty 42613 1727204626.78295: checking for any_errors_fatal 42613 1727204626.78297: done checking for any_errors_fatal 42613 1727204626.78298: checking for max_fail_percentage 42613 1727204626.78299: done checking for max_fail_percentage 42613 1727204626.78300: checking to see if all hosts have failed and the running result is not ok 42613 1727204626.78300: done checking to see if all hosts have failed 42613 1727204626.78301: getting the remaining hosts for this loop 42613 1727204626.78301: done getting the remaining hosts for this loop 42613 1727204626.78304: getting the next task for host managed-node3 42613 1727204626.78307: done getting next task for host managed-node3 42613 1727204626.78308: ^ task is: TASK: meta (flush_handlers) 42613 1727204626.78309: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
42613 1727204626.78311: getting variables
42613 1727204626.78312: in VariableManager get_vars()
42613 1727204626.78319: Calling all_inventory to load vars for managed-node3
42613 1727204626.78320: Calling groups_inventory to load vars for managed-node3
42613 1727204626.78322: Calling all_plugins_inventory to load vars for managed-node3
42613 1727204626.78327: Calling all_plugins_play to load vars for managed-node3
42613 1727204626.78329: Calling groups_plugins_inventory to load vars for managed-node3
42613 1727204626.78331: Calling groups_plugins_play to load vars for managed-node3
42613 1727204626.79660: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
42613 1727204626.81268: done with get_vars()
42613 1727204626.81297: done getting variables
42613 1727204626.81340: in VariableManager get_vars()
42613 1727204626.81349: Calling all_inventory to load vars for managed-node3
42613 1727204626.81351: Calling groups_inventory to load vars for managed-node3
42613 1727204626.81353: Calling all_plugins_inventory to load vars for managed-node3
42613 1727204626.81357: Calling all_plugins_play to load vars for managed-node3
42613 1727204626.81358: Calling groups_plugins_inventory to load vars for managed-node3
42613 1727204626.81360: Calling groups_plugins_play to load vars for managed-node3
42613 1727204626.82315: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
42613 1727204626.84439: done with get_vars()
42613 1727204626.84487: done queuing things up, now waiting for results queue to drain
42613 1727204626.84489: results queue empty
42613 1727204626.84489: checking for any_errors_fatal
42613 1727204626.84490: done checking for any_errors_fatal
42613 1727204626.84491: checking for max_fail_percentage
42613 1727204626.84492: done checking for max_fail_percentage
42613 1727204626.84492: checking to see if all hosts have failed and the running result is not ok
42613 1727204626.84493: done checking to see if all hosts have failed
42613 1727204626.84493: getting the remaining hosts for this loop
42613 1727204626.84494: done getting the remaining hosts for this loop
42613 1727204626.84502: getting the next task for host managed-node3
42613 1727204626.84504: done getting next task for host managed-node3
42613 1727204626.84505: ^ task is: None
42613 1727204626.84506: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
42613 1727204626.84507: done queuing things up, now waiting for results queue to drain
42613 1727204626.84508: results queue empty
42613 1727204626.84508: checking for any_errors_fatal
42613 1727204626.84508: done checking for any_errors_fatal
42613 1727204626.84509: checking for max_fail_percentage
42613 1727204626.84510: done checking for max_fail_percentage
42613 1727204626.84510: checking to see if all hosts have failed and the running result is not ok
42613 1727204626.84510: done checking to see if all hosts have failed
42613 1727204626.84511: getting the next task for host managed-node3
42613 1727204626.84513: done getting next task for host managed-node3
42613 1727204626.84513: ^ task is: None
42613 1727204626.84514: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False

PLAY RECAP *********************************************************************
managed-node3              : ok=86   changed=5    unreachable=0    failed=0    skipped=74   rescued=0    ignored=1

Tuesday 24 September 2024  15:03:46 -0400 (0:00:00.157)       0:00:55.453 *****
===============================================================================
fedora.linux_system_roles.network : Check which services are running ---- 2.92s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 2.79s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 2.74s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Install iproute --------------------------------------------------------- 2.18s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
Gathering Facts --------------------------------------------------------- 2.10s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tests_routing_rules_nm.yml:6
fedora.linux_system_roles.network : Check which packages are installed --- 1.45s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Configure networking connection profiles --- 1.38s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Gathering Facts --------------------------------------------------------- 1.37s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:3
Gathering Facts --------------------------------------------------------- 1.36s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:5
Gathering Facts --------------------------------------------------------- 1.29s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:227
Gathering Facts --------------------------------------------------------- 1.29s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3
Create veth interface ethtest0 ------------------------------------------ 1.28s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27
fedora.linux_system_roles.network : Configure networking connection profiles --- 1.24s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Check which packages are installed --- 1.16s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Check which packages are installed --- 1.06s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gather current interface info ------------------------------------------- 1.03s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.99s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Gathering Facts --------------------------------------------------------- 0.99s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.92s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Gather the minimum subset of ansible_facts required by the network role test --- 0.92s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3
42613 1727204626.84609: RUNNING CLEANUP