41175 1727204632.42286: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-twx
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
41175 1727204632.42810: Added group all to inventory
41175 1727204632.42812: Added group ungrouped to inventory
41175 1727204632.42817: Group all now contains ungrouped
41175 1727204632.42821: Examining possible inventory source: /tmp/network-6Zh/inventory-Sfc.yml
41175 1727204632.61893: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
41175 1727204632.61944: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
41175 1727204632.61964: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
41175 1727204632.62020: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
41175 1727204632.62079: Loaded config def from plugin (inventory/script)
41175 1727204632.62081: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
41175 1727204632.62119: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
41175 1727204632.62190: Loaded config def from plugin (inventory/yaml)
41175 1727204632.62193: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
41175 1727204632.62266: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
41175 1727204632.62622: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
41175 1727204632.62625: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
41175 1727204632.62628: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
41175 1727204632.62634: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
41175 1727204632.62638: Loading data from /tmp/network-6Zh/inventory-Sfc.yml
41175 1727204632.62709: /tmp/network-6Zh/inventory-Sfc.yml was not parsable by auto
41175 1727204632.62778: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
41175 1727204632.62829: Loading data from /tmp/network-6Zh/inventory-Sfc.yml
41175 1727204632.62931: group all already in inventory
41175 1727204632.62939: set inventory_file for managed-node1
41175 1727204632.62943: set inventory_dir for managed-node1
41175 1727204632.62944: Added host managed-node1 to inventory
41175 1727204632.62947: Added host managed-node1 to group all
41175 1727204632.62949: set ansible_host for managed-node1
41175 1727204632.62950: set ansible_ssh_extra_args for managed-node1
41175 1727204632.62953: set inventory_file for managed-node2
41175 1727204632.62957: set inventory_dir for managed-node2
41175 1727204632.62958: Added host managed-node2 to inventory
41175 1727204632.62960: Added host managed-node2 to group all
41175 1727204632.62961: set ansible_host for managed-node2
41175 1727204632.62962: set ansible_ssh_extra_args for managed-node2
41175 1727204632.62965: set inventory_file for managed-node3
41175 1727204632.62968: set inventory_dir for managed-node3
41175 1727204632.62969: Added host managed-node3 to inventory
41175 1727204632.62970: Added host managed-node3 to group all
41175 1727204632.62971: set ansible_host for managed-node3
41175 1727204632.62972: set ansible_ssh_extra_args for managed-node3
41175 1727204632.62975: Reconcile groups and hosts in inventory.
41175 1727204632.62980: Group ungrouped now contains managed-node1
41175 1727204632.62983: Group ungrouped now contains managed-node2
41175 1727204632.62985: Group ungrouped now contains managed-node3
41175 1727204632.63080: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name
41175 1727204632.63245: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments
41175 1727204632.63312: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py
41175 1727204632.63348: Loaded config def from plugin (vars/host_group_vars)
41175 1727204632.63351: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True)
41175 1727204632.63360: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars
41175 1727204632.63369: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False)
41175 1727204632.63425: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False)
41175 1727204632.63816: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41175 1727204632.63931: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py
41175 1727204632.63986: Loaded config def from plugin (connection/local)
41175 1727204632.63992: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True)
41175 1727204632.64702: Loaded config def from plugin (connection/paramiko_ssh)
41175 1727204632.64705: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True)
41175 1727204632.65438: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
41175 1727204632.65470: Loaded config def from plugin (connection/psrp)
41175 1727204632.65472: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True)
41175 1727204632.66069: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
41175 1727204632.66120: Loaded config def from plugin (connection/ssh)
41175 1727204632.66124: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True)
41175 1727204632.68605: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
41175 1727204632.68638: Loaded config def from plugin (connection/winrm)
41175 1727204632.68641: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True)
41175 1727204632.68669: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name
41175 1727204632.68724: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py
41175 1727204632.68787: Loaded config def from plugin (shell/cmd)
41175 1727204632.68790: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True)
41175 1727204632.68812: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False)
41175 1727204632.68865: Loaded config def from plugin (shell/powershell)
41175 1727204632.68867: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True)
41175 1727204632.68915: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py
41175 1727204632.69061: Loaded config def from plugin (shell/sh)
41175 1727204632.69063: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True)
41175 1727204632.69092: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name
41175 1727204632.69194: Loaded config def from plugin (become/runas)
41175 1727204632.69196: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True)
41175 1727204632.69352: Loaded config def from plugin (become/su)
41175 1727204632.69354: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True)
41175 1727204632.69485: Loaded config def from plugin (become/sudo)
41175 1727204632.69487: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True)
running playbook inside collection fedora.linux_system_roles
41175 1727204632.69517: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tests_route_table_nm.yml
41175 1727204632.69793: in VariableManager get_vars()
41175 1727204632.69811: done with get_vars()
41175 1727204632.69922: trying /usr/local/lib/python3.12/site-packages/ansible/modules
41175 1727204632.72654: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action
41175 1727204632.72746: in VariableManager get_vars()
41175 1727204632.72751: done with get_vars()
41175 1727204632.72753: variable 'playbook_dir' from source: magic vars
41175 1727204632.72754: variable 'ansible_playbook_python' from source: magic vars
41175 1727204632.72755: variable 'ansible_config_file' from source: magic vars
41175 1727204632.72755: variable 'groups' from source: magic vars
41175 1727204632.72756: variable 'omit' from source: magic vars
41175 1727204632.72756: variable 'ansible_version' from source: magic vars
41175 1727204632.72757: variable 'ansible_check_mode' from source: magic vars
41175 1727204632.72758: variable 'ansible_diff_mode' from source: magic vars
41175 1727204632.72758: variable 'ansible_forks' from source: magic vars
41175 1727204632.72759: variable 'ansible_inventory_sources' from source: magic vars
41175 1727204632.72759: variable 'ansible_skip_tags' from source: magic vars
41175 1727204632.72760: variable 'ansible_limit' from source: magic vars
41175 1727204632.72760: variable 'ansible_run_tags' from source: magic vars
41175 1727204632.72761: variable 'ansible_verbosity' from source: magic vars
41175 1727204632.72794: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml
41175 1727204632.73397: in VariableManager get_vars()
41175 1727204632.73410: done with get_vars()
41175 1727204632.73444: in VariableManager get_vars()
41175 1727204632.73455: done with get_vars()
41175 1727204632.73481: in VariableManager get_vars()
41175 1727204632.73492: done with get_vars()
41175 1727204632.73528: in VariableManager get_vars()
41175 1727204632.73540: done with get_vars()
41175 1727204632.73545: variable 'omit' from source: magic vars
41175 1727204632.73560: variable 'omit' from source: magic vars
41175 1727204632.73587: in VariableManager get_vars()
41175 1727204632.73597: done with get_vars()
41175 1727204632.73635: in VariableManager get_vars()
41175 1727204632.73647: done with get_vars()
41175 1727204632.73677: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
41175 1727204632.73851: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
41175 1727204632.73958: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
41175 1727204632.74481: in VariableManager get_vars()
41175 1727204632.74498: done with get_vars()
41175 1727204632.74858: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__
41175 1727204632.74973: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
41175 1727204632.76637: in VariableManager get_vars()
41175 1727204632.76651: done with get_vars()
41175 1727204632.76654: variable 'omit' from source: magic vars
41175 1727204632.76662: variable 'omit' from source: magic vars
41175 1727204632.76687: in VariableManager get_vars()
41175 1727204632.76701: done with get_vars()
41175 1727204632.76718: in VariableManager get_vars()
41175 1727204632.76731: done with get_vars()
41175 1727204632.76755: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
41175 1727204632.76844: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
41175 1727204632.76904: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
41175 1727204632.78412: in VariableManager get_vars()
41175 1727204632.78432: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
41175 1727204632.80212: in VariableManager get_vars()
41175 1727204632.80215: done with get_vars()
41175 1727204632.80218: variable 'playbook_dir' from source: magic vars
41175 1727204632.80219: variable 'ansible_playbook_python' from source: magic vars
41175 1727204632.80220: variable 'ansible_config_file' from source: magic vars
41175 1727204632.80221: variable 'groups' from source: magic vars
41175 1727204632.80221: variable 'omit' from source: magic vars
41175 1727204632.80222: variable 'ansible_version' from source: magic vars
41175 1727204632.80222: variable 'ansible_check_mode' from source: magic vars
41175 1727204632.80223: variable 'ansible_diff_mode' from source: magic vars
41175 1727204632.80223: variable 'ansible_forks' from source: magic vars
41175 1727204632.80224: variable 'ansible_inventory_sources' from source: magic vars
41175 1727204632.80225: variable 'ansible_skip_tags' from source: magic vars
41175 1727204632.80225: variable 'ansible_limit' from source: magic vars
41175 1727204632.80226: variable 'ansible_run_tags' from source: magic vars
41175 1727204632.80226: variable 'ansible_verbosity' from source: magic vars
41175 1727204632.80253: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml
41175 1727204632.80316: in VariableManager get_vars()
41175 1727204632.80319: done with get_vars()
41175 1727204632.80321: variable 'playbook_dir' from source: magic vars
41175 1727204632.80321: variable 'ansible_playbook_python' from source: magic vars
41175 1727204632.80322: variable 'ansible_config_file' from source: magic vars
41175 1727204632.80323: variable 'groups' from source: magic vars
41175 1727204632.80323: variable 'omit' from source: magic vars
41175 1727204632.80324: variable 'ansible_version' from source: magic vars
41175 1727204632.80324: variable 'ansible_check_mode' from source: magic vars
41175 1727204632.80325: variable 'ansible_diff_mode' from source: magic vars
41175 1727204632.80325: variable 'ansible_forks' from source: magic vars
41175 1727204632.80326: variable 'ansible_inventory_sources' from source: magic vars
41175 1727204632.80330: variable 'ansible_skip_tags' from source: magic vars
41175 1727204632.80331: variable 'ansible_limit' from source: magic vars
41175 1727204632.80332: variable 'ansible_run_tags' from source: magic vars
41175 1727204632.80332: variable 'ansible_verbosity' from source: magic vars
41175 1727204632.80359: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml
41175 1727204632.80429: in VariableManager get_vars()
41175 1727204632.80439: done with get_vars()
41175 1727204632.80471: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
41175 1727204632.80562: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
41175 1727204632.80624: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
41175 1727204632.80938: in VariableManager get_vars()
41175 1727204632.80955: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
41175 1727204632.82232: in VariableManager get_vars()
41175 1727204632.82242: done with get_vars()
41175 1727204632.82274: in VariableManager get_vars()
41175 1727204632.82276: done with get_vars()
41175 1727204632.82278: variable 'playbook_dir' from source: magic vars
41175 1727204632.82278: variable 'ansible_playbook_python' from source: magic vars
41175 1727204632.82279: variable 'ansible_config_file' from source: magic vars
41175 1727204632.82280: variable 'groups' from source: magic vars
41175 1727204632.82280: variable 'omit' from source: magic vars
41175 1727204632.82281: variable 'ansible_version' from source: magic vars
41175 1727204632.82281: variable 'ansible_check_mode' from source: magic vars
41175 1727204632.82282: variable 'ansible_diff_mode' from source: magic vars
41175 1727204632.82282: variable 'ansible_forks' from source: magic vars
41175 1727204632.82283: variable 'ansible_inventory_sources' from source: magic vars
41175 1727204632.82284: variable 'ansible_skip_tags' from source: magic vars
41175 1727204632.82284: variable 'ansible_limit' from source: magic vars
41175 1727204632.82285: variable 'ansible_run_tags' from source: magic vars
41175 1727204632.82285: variable 'ansible_verbosity' from source: magic vars
41175 1727204632.82312: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml
41175 1727204632.82369: in VariableManager get_vars()
41175 1727204632.82380: done with get_vars()
41175 1727204632.82414: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
41175 1727204632.82514: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
41175 1727204632.82572: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
41175 1727204632.82877: in VariableManager get_vars()
41175 1727204632.82894: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
41175 1727204632.84168: in VariableManager get_vars()
41175 1727204632.84178: done with get_vars()
41175 1727204632.84207: in VariableManager get_vars()
41175 1727204632.84216: done with get_vars()
41175 1727204632.84248: in VariableManager get_vars()
41175 1727204632.84258: done with get_vars()
41175 1727204632.84311: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback
41175 1727204632.84336: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
41175 1727204632.84527: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py
41175 1727204632.84655: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug)
41175 1727204632.84658: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-twx/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback)
41175 1727204632.84685: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name
41175 1727204632.84707: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False)
41175 1727204632.84847: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py
41175 1727204632.84903: Loaded config def from plugin (callback/default)
41175 1727204632.84905: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
41175 1727204632.85858: Loaded config def from plugin (callback/junit)
41175 1727204632.85861: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
41175 1727204632.85900: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False)
41175 1727204632.85954: Loaded config def from plugin (callback/minimal)
41175 1727204632.85955: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
41175 1727204632.86010: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
41175 1727204632.86064: Loaded config def from plugin (callback/tree)
41175 1727204632.86066: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
41175 1727204632.86168: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks)
41175 1727204632.86170: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-twx/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_route_table_nm.yml *********************************************
6 plays in /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tests_route_table_nm.yml
41175 1727204632.86197: in VariableManager get_vars()
41175 1727204632.86207: done with get_vars()
41175 1727204632.86212: in VariableManager get_vars()
41175 1727204632.86220: done with get_vars()
41175 1727204632.86223: variable 'omit' from source: magic vars
41175 1727204632.86251: in VariableManager get_vars()
41175 1727204632.86261: done with get_vars()
41175 1727204632.86276: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_route_table.yml' with nm as provider] ******
41175 1727204632.86775: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
41175 1727204632.86837: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
41175 1727204632.86865: getting the remaining hosts for this loop
41175 1727204632.86866: done getting the remaining hosts for this loop
41175 1727204632.86869: getting the next task for host managed-node3
41175 1727204632.86871: done getting next task for host managed-node3
41175 1727204632.86873: ^ task is: TASK: Gathering Facts
41175 1727204632.86874: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
41175 1727204632.86876: getting variables
41175 1727204632.86877: in VariableManager get_vars()
41175 1727204632.86884: Calling all_inventory to load vars for managed-node3
41175 1727204632.86886: Calling groups_inventory to load vars for managed-node3
41175 1727204632.86888: Calling all_plugins_inventory to load vars for managed-node3
41175 1727204632.86900: Calling all_plugins_play to load vars for managed-node3
41175 1727204632.86909: Calling groups_plugins_inventory to load vars for managed-node3
41175 1727204632.86912: Calling groups_plugins_play to load vars for managed-node3
41175 1727204632.86940: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41175 1727204632.86984: done with get_vars()
41175 1727204632.86991: done getting variables
41175 1727204632.87056: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tests_route_table_nm.yml:6
Tuesday 24 September 2024  15:03:52 -0400 (0:00:00.009)       0:00:00.009 *****
41175 1727204632.87075: entering _queue_task() for managed-node3/gather_facts
41175 1727204632.87076: Creating lock for gather_facts
41175 1727204632.87361: worker is 1 (out of 1 available)
41175 1727204632.87374: exiting _queue_task() for managed-node3/gather_facts
41175 1727204632.87386: done queuing things up, now waiting for results queue to drain
41175 1727204632.87388: waiting for pending results...
41175 1727204632.87535: running TaskExecutor() for managed-node3/TASK: Gathering Facts
41175 1727204632.87597: in run() - task 12b410aa-8751-f070-39c4-0000000000f5
41175 1727204632.87611: variable 'ansible_search_path' from source: unknown
41175 1727204632.87644: calling self._execute()
41175 1727204632.87699: variable 'ansible_host' from source: host vars for 'managed-node3'
41175 1727204632.87706: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
41175 1727204632.87715: variable 'omit' from source: magic vars
41175 1727204632.87800: variable 'omit' from source: magic vars
41175 1727204632.87824: variable 'omit' from source: magic vars
41175 1727204632.87857: variable 'omit' from source: magic vars
41175 1727204632.87900: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
41175 1727204632.87932: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
41175 1727204632.87953: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
41175 1727204632.87969: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
41175 1727204632.87982: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
41175 1727204632.88010: variable 'inventory_hostname' from source: host vars for 'managed-node3'
41175 1727204632.88013: variable 'ansible_host' from source: host vars for 'managed-node3'
41175 1727204632.88020: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
41175 1727204632.88107: Set connection var ansible_shell_executable to /bin/sh
41175 1727204632.88111: Set connection var ansible_shell_type to sh
41175 1727204632.88120: Set connection var ansible_pipelining to False
41175 1727204632.88127: Set connection var ansible_timeout to 10
41175 1727204632.88134: Set connection var ansible_connection to ssh
41175 1727204632.88140: Set connection var ansible_module_compression to ZIP_DEFLATED
41175 1727204632.88163: variable 'ansible_shell_executable' from source: unknown
41175 1727204632.88166: variable 'ansible_connection' from source: unknown
41175 1727204632.88169: variable 'ansible_module_compression' from source: unknown
41175 1727204632.88172: variable 'ansible_shell_type' from source: unknown
41175 1727204632.88180: variable 'ansible_shell_executable' from source: unknown
41175 1727204632.88183: variable 'ansible_host' from source: host vars for 'managed-node3'
41175 1727204632.88185: variable 'ansible_pipelining' from source: unknown
41175 1727204632.88188: variable 'ansible_timeout' from source: unknown
41175 1727204632.88196: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
41175 1727204632.88350: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
41175 1727204632.88360: variable 'omit' from source: magic vars
41175 1727204632.88367: starting attempt loop
41175 1727204632.88370: running the handler
41175 1727204632.88384: variable 'ansible_facts' from source: unknown
41175 1727204632.88404: _low_level_execute_command(): starting
41175 1727204632.88413: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
41175 1727204632.88968: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
<<<
41175 1727204632.88974: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
<<<
41175 1727204632.88977: stderr chunk (state=3): >>>debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.10.90 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90
debug2: match found
<<<
41175 1727204632.88980: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
<<<
41175 1727204632.89028: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master
<<<
41175 1727204632.89032: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK
<<<
41175 1727204632.89042: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4
<<<
41175 1727204632.89103: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2
<<<
41175 1727204632.90855: stdout chunk (state=3): >>>/root
<<<
41175 1727204632.90965: stderr chunk (state=3): >>>debug2: Received exit status from master 0
<<<
41175 1727204632.91019: stderr chunk (state=3): >>><<<
41175 1727204632.91028: stdout chunk (state=3): >>><<<
41175 1727204632.91049: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.10.90 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
41175 1727204632.91061: _low_level_execute_command(): starting
41175 1727204632.91067: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204632.910493-41189-240074217015977 `" && echo ansible-tmp-1727204632.910493-41189-240074217015977="` echo /root/.ansible/tmp/ansible-tmp-1727204632.910493-41189-240074217015977 `" ) && sleep 0'
41175 1727204632.91533: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023
debug1: Reading configuration data /root/.ssh/config
<<<
41175 1727204632.91537: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90
debug2: match not found
<<<
41175 1727204632.91540: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.10.90 is address
debug1: re-parsing configuration
<<<
41175 1727204632.91542: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90
debug2: match found
<<<
41175 1727204632.91552: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
<<<
41175 1727204632.91598: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master
<<<
41175 1727204632.91601: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
<<<
41175 1727204632.91644: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2
<<<
41175 1727204632.93632: stdout chunk (state=3): >>>ansible-tmp-1727204632.910493-41189-240074217015977=/root/.ansible/tmp/ansible-tmp-1727204632.910493-41189-240074217015977
<<<
41175 1727204632.93754: stderr chunk (state=3): >>>debug2: Received exit status from master 0
<<<
41175 1727204632.93802: stderr chunk (state=3): >>><<<
41175 1727204632.93806: stdout chunk (state=3): >>><<<
41175 1727204632.93823: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204632.910493-41189-240074217015977=/root/.ansible/tmp/ansible-tmp-1727204632.910493-41189-240074217015977
, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.10.90
originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204632.93853: variable 'ansible_module_compression' from source: unknown 41175 1727204632.93902: ANSIBALLZ: Using generic lock for ansible.legacy.setup 41175 1727204632.93906: ANSIBALLZ: Acquiring lock 41175 1727204632.93909: ANSIBALLZ: Lock acquired: 140088839296144 41175 1727204632.93914: ANSIBALLZ: Creating module 41175 1727204633.17855: ANSIBALLZ: Writing module into payload 41175 1727204633.17978: ANSIBALLZ: Writing module 41175 1727204633.18013: ANSIBALLZ: Renaming module 41175 1727204633.18016: ANSIBALLZ: Done creating module 41175 1727204633.18049: variable 'ansible_facts' from source: unknown 41175 1727204633.18056: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41175 1727204633.18065: _low_level_execute_command(): starting 41175 1727204633.18071: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 41175 1727204633.18576: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204633.18580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204633.18583: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 41175 1727204633.18585: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 41175 1727204633.18587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204633.18649: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204633.18653: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204633.18702: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204633.20467: stdout chunk (state=3): >>>PLATFORM <<< 41175 1727204633.20545: stdout chunk (state=3): >>>Linux <<< 41175 1727204633.20571: stdout chunk (state=3): >>>FOUND /usr/bin/python3.12 <<< 41175 1727204633.20587: stdout chunk (state=3): >>>/usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 41175 1727204633.20728: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204633.20803: stderr chunk (state=3): >>><<< 41175 1727204633.20806: stdout chunk (state=3): >>><<< 41175 1727204633.20825: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204633.20836 [managed-node3]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 41175 1727204633.20875: _low_level_execute_command(): starting 41175 1727204633.20880: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 41175 1727204633.20973: Sending initial data 41175 1727204633.20976: Sent initial data (1181 bytes) 41175 1727204633.21371: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204633.21375: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204633.21378: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204633.21380: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204633.21446: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204633.21449: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204633.21452: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204633.21480: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204633.25180: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"Fedora Linux\"\nVERSION=\"39 (Thirty Nine)\"\nID=fedora\nVERSION_ID=39\nVERSION_CODENAME=\"\"\nPLATFORM_ID=\"platform:f39\"\nPRETTY_NAME=\"Fedora Linux 39 (Thirty Nine)\"\nANSI_COLOR=\"0;38;2;60;110;180\"\nLOGO=fedora-logo-icon\nCPE_NAME=\"cpe:/o:fedoraproject:fedora:39\"\nDEFAULT_HOSTNAME=\"fedora\"\nHOME_URL=\"https://fedoraproject.org/\"\nDOCUMENTATION_URL=\"https://docs.fedoraproject.org/en-US/fedora/f39/system-administrators-guide/\"\nSUPPORT_URL=\"https://ask.fedoraproject.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_BUGZILLA_PRODUCT=\"Fedora\"\nREDHAT_BUGZILLA_PRODUCT_VERSION=39\nREDHAT_SUPPORT_PRODUCT=\"Fedora\"\nREDHAT_SUPPORT_PRODUCT_VERSION=39\nSUPPORT_END=2024-11-12\n"} <<< 41175 1727204633.25588: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204633.25655: stderr chunk (state=3): >>><<< 41175 1727204633.25659: stdout chunk (state=3): >>><<< 41175 1727204633.25674: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"Fedora Linux\"\nVERSION=\"39 
(Thirty Nine)\"\nID=fedora\nVERSION_ID=39\nVERSION_CODENAME=\"\"\nPLATFORM_ID=\"platform:f39\"\nPRETTY_NAME=\"Fedora Linux 39 (Thirty Nine)\"\nANSI_COLOR=\"0;38;2;60;110;180\"\nLOGO=fedora-logo-icon\nCPE_NAME=\"cpe:/o:fedoraproject:fedora:39\"\nDEFAULT_HOSTNAME=\"fedora\"\nHOME_URL=\"https://fedoraproject.org/\"\nDOCUMENTATION_URL=\"https://docs.fedoraproject.org/en-US/fedora/f39/system-administrators-guide/\"\nSUPPORT_URL=\"https://ask.fedoraproject.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_BUGZILLA_PRODUCT=\"Fedora\"\nREDHAT_BUGZILLA_PRODUCT_VERSION=39\nREDHAT_SUPPORT_PRODUCT=\"Fedora\"\nREDHAT_SUPPORT_PRODUCT_VERSION=39\nSUPPORT_END=2024-11-12\n"} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204633.25752: variable 'ansible_facts' from source: unknown 41175 1727204633.25756: variable 'ansible_facts' from source: unknown 41175 1727204633.25765: variable 
'ansible_module_compression' from source: unknown 41175 1727204633.25805: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-41175qa2aqh90/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 41175 1727204633.25835: variable 'ansible_facts' from source: unknown 41175 1727204633.26139: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204632.910493-41189-240074217015977/AnsiballZ_setup.py 41175 1727204633.26525: Sending initial data 41175 1727204633.26528: Sent initial data (153 bytes) 41175 1727204633.26881: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 1727204633.26902: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204633.26923: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204633.26947: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204633.26967: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 41175 1727204633.26981: stderr chunk (state=3): >>>debug2: match not found <<< 41175 1727204633.27001: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204633.27025: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41175 1727204633.27125: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41175 
1727204633.27150: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204633.27229: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204633.28923: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41175 1727204633.28995: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41175 1727204633.29032: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41175qa2aqh90/tmpzbo0qm3p /root/.ansible/tmp/ansible-tmp-1727204632.910493-41189-240074217015977/AnsiballZ_setup.py <<< 41175 1727204633.29058: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204632.910493-41189-240074217015977/AnsiballZ_setup.py" <<< 41175 1727204633.29091: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-41175qa2aqh90/tmpzbo0qm3p" to remote "/root/.ansible/tmp/ansible-tmp-1727204632.910493-41189-240074217015977/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204632.910493-41189-240074217015977/AnsiballZ_setup.py" <<< 41175 1727204633.31005: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204633.31027: stderr chunk (state=3): >>><<< 41175 1727204633.31034: stdout chunk (state=3): >>><<< 41175 1727204633.31178: done transferring module to remote 41175 1727204633.31208: _low_level_execute_command(): starting 41175 1727204633.31234: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204632.910493-41189-240074217015977/ /root/.ansible/tmp/ansible-tmp-1727204632.910493-41189-240074217015977/AnsiballZ_setup.py && sleep 0' 41175 1727204633.31799: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204633.31814: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204633.31819: stderr chunk (state=3): >>>debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 41175 1727204633.31822: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 41175 1727204633.31835: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204633.31875: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204633.31879: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204633.31929: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204633.33799: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204633.33858: stderr chunk (state=3): >>><<< 41175 1727204633.33862: stdout chunk (state=3): >>><<< 41175 1727204633.33876: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204633.33879: _low_level_execute_command(): starting 41175 1727204633.33886: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204632.910493-41189-240074217015977/AnsiballZ_setup.py && sleep 0' 41175 1727204633.34356: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204633.34360: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204633.34362: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204633.34364: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204633.34427: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204633.34431: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 <<< 41175 1727204633.34467: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204633.36639: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 41175 1727204633.36679: stdout chunk (state=3): >>>import _imp # builtin <<< 41175 1727204633.36712: stdout chunk (state=3): >>>import '_thread' # <<< 41175 1727204633.36716: stdout chunk (state=3): >>>import '_warnings' # import '_weakref' # <<< 41175 1727204633.36792: stdout chunk (state=3): >>>import '_io' # <<< 41175 1727204633.36801: stdout chunk (state=3): >>>import 'marshal' # <<< 41175 1727204633.36830: stdout chunk (state=3): >>>import 'posix' # <<< 41175 1727204633.36865: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 41175 1727204633.36905: stdout chunk (state=3): >>>import 'time' # <<< 41175 1727204633.36909: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 41175 1727204633.36962: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 41175 1727204633.36968: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 41175 1727204633.36985: stdout chunk (state=3): >>>import '_codecs' # <<< 41175 1727204633.37013: stdout chunk (state=3): >>>import 'codecs' # <<< 41175 1727204633.37056: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 41175 1727204633.37080: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 41175 1727204633.37084: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802adb44d0> <<< 41175 1727204633.37116: stdout chunk (state=3): >>>import 'encodings' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f802ad83ad0> <<< 41175 1727204633.37123: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 41175 1727204633.37143: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802adb6a20> <<< 41175 1727204633.37161: stdout chunk (state=3): >>>import '_signal' # <<< 41175 1727204633.37185: stdout chunk (state=3): >>>import '_abc' # <<< 41175 1727204633.37192: stdout chunk (state=3): >>>import 'abc' # <<< 41175 1727204633.37207: stdout chunk (state=3): >>>import 'io' # <<< 41175 1727204633.37246: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 41175 1727204633.37339: stdout chunk (state=3): >>>import '_collections_abc' # <<< 41175 1727204633.37376: stdout chunk (state=3): >>>import 'genericpath' # <<< 41175 1727204633.37382: stdout chunk (state=3): >>>import 'posixpath' # <<< 41175 1727204633.37399: stdout chunk (state=3): >>>import 'os' # <<< 41175 1727204633.37426: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 41175 1727204633.37441: stdout chunk (state=3): >>>Processing user site-packages <<< 41175 1727204633.37456: stdout chunk (state=3): >>>Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' <<< 41175 1727204633.37471: stdout chunk (state=3): >>>Adding directory: '/usr/lib64/python3.12/site-packages' <<< 41175 1727204633.37474: stdout chunk (state=3): >>>Adding directory: '/usr/lib/python3.12/site-packages' <<< 41175 1727204633.37494: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 41175 1727204633.37507: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/utf_8_sig.py <<< 41175 1727204633.37513: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 41175 1727204633.37533: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802ab650a0> <<< 41175 1727204633.37604: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 41175 1727204633.37620: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 41175 1727204633.37623: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802ab65fd0> <<< 41175 1727204633.37652: stdout chunk (state=3): >>>import 'site' # <<< 41175 1727204633.37691: stdout chunk (state=3): >>>Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<<
41175 1727204633.38084: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<<
41175 1727204633.38105: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<<
41175 1727204633.38130: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<<
41175 1727204633.38137: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<<
41175 1727204633.38157: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<<
41175 1727204633.38200: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<<
41175 1727204633.38219: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<<
41175 1727204633.38253: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<<
41175 1727204633.38259: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802aba3dd0> <<<
41175 1727204633.38282: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<<
41175 1727204633.38299: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<<
41175 1727204633.38324: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802aba3fe0> <<<
41175 1727204633.38355: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<<
41175 1727204633.38377: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<<
41175 1727204633.38406: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<<
41175 1727204633.38455: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<<
41175 1727204633.38476: stdout chunk (state=3): >>>import 'itertools' # <<<
41175 1727204633.38505: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' <<<
41175 1727204633.38515: stdout chunk (state=3): >>>import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802abdb800> <<<
41175 1727204633.38544: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<<
41175 1727204633.38547: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' <<<
41175 1727204633.38558: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802abdbe90> <<<
41175 1727204633.38570: stdout chunk (state=3): >>>import '_collections' # <<<
41175 1727204633.38622: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802abbbaa0> <<<
41175 1727204633.38630: stdout chunk (state=3): >>>import '_functools' # <<<
41175 1727204633.38663: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802abb91c0> <<<
41175 1727204633.38759: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802aba0f80> <<<
41175 1727204633.38794: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<<
41175 1727204633.38812: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<<
41175 1727204633.38826: stdout chunk (state=3): >>>import '_sre' # <<<
41175 1727204633.38851: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<<
41175 1727204633.38875: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<<
41175 1727204633.38914: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py <<<
41175 1727204633.38932: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<<
41175 1727204633.38954: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802abff6e0> <<<
41175 1727204633.38973: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802abfe300> <<<
41175 1727204633.38997: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802abba1b0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802abfcbf0> <<<
41175 1727204633.39057: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<<
41175 1727204633.39068: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802ac30710> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802aba0200> <<<
41175 1727204633.39097: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<<
41175 1727204633.39140: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f802ac30bc0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802ac30a70> <<<
41175 1727204633.39191: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<<
41175 1727204633.39194: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f802ac30e60> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802ab9ed20> <<<
41175 1727204633.39224: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<<
41175 1727204633.39254: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<<
41175 1727204633.39288: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<<
41175 1727204633.39309: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802ac31520> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802ac311f0> <<<
41175 1727204633.39329: stdout chunk (state=3): >>>import 'importlib.machinery' # <<<
41175 1727204633.39340: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<<
41175 1727204633.39380: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802ac32420> import 'importlib.util' # <<<
41175 1727204633.39411: stdout chunk (state=3): >>>import 'runpy' # <<<
41175 1727204633.39415: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<<
41175 1727204633.39447: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<<
41175 1727204633.39493: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' <<<
41175 1727204633.39504: stdout chunk (state=3): >>>import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802ac4c650> import 'errno' # <<<
41175 1727204633.39538: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<<
41175 1727204633.39555: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f802ac4dd90> <<<
41175 1727204633.39590: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<<
41175 1727204633.39624: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<<
41175 1727204633.39627: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802ac4ec90> <<<
41175 1727204633.39677: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f802ac4f2f0> <<<
41175 1727204633.39680: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802ac4e1e0> <<<
41175 1727204633.39712: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<<
41175 1727204633.39761: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<<
41175 1727204633.39777: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f802ac4fd70> <<<
41175 1727204633.39787: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802ac4f4a0> <<<
41175 1727204633.39826: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802ac32480> <<<
41175 1727204633.39843: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<<
41175 1727204633.39870: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<<
41175 1727204633.39893: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<<
41175 1727204633.39921: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<<
41175 1727204633.39952: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f802a95fce0> <<<
41175 1727204633.39981: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py <<<
41175 1727204633.40004: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<<
41175 1727204633.40030: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f802a9887d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a988530> <<<
41175 1727204633.40081: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f802a988800> <<<
41175 1727204633.40084: stdout chunk (state=3): >>># extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f802a9889e0> <<<
41175 1727204633.40106: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a95de80> <<<
41175 1727204633.40117: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<<
41175 1727204633.40216: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<<
41175 1727204633.40265: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<<
41175 1727204633.40279: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a98a0f0> <<<
41175 1727204633.40308: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a988d70> <<<
41175 1727204633.40311: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802ac32b70> <<<
41175 1727204633.40334: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<<
41175 1727204633.40404: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<<
41175 1727204633.40414: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<<
41175 1727204633.40454: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<<
41175 1727204633.40496: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a9b24b0> <<<
41175 1727204633.40544: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<<
41175 1727204633.40556: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<<
41175 1727204633.40577: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<<
41175 1727204633.40606: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<<
41175 1727204633.40648: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a9ce600> <<<
41175 1727204633.40673: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<<
41175 1727204633.40715: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<<
41175 1727204633.40774: stdout chunk (state=3): >>>import 'ntpath' # <<<
41175 1727204633.40802: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802aa033b0> <<<
41175 1727204633.40832: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<<
41175 1727204633.40864: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<<
41175 1727204633.40895: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<<
41175 1727204633.40938: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<<
41175 1727204633.41032: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802aa29b20> <<<
41175 1727204633.41105: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802aa034d0> <<<
41175 1727204633.41155: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a9cf290> <<<
41175 1727204633.41183: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py <<<
41175 1727204633.41192: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a8484d0> <<<
41175 1727204633.41209: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a9cd640> <<<
41175 1727204633.41215: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a98b050> <<<
41175 1727204633.41376: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<<
41175 1727204633.41401: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f802a848770> <<<
41175 1727204633.41582: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_g2eyos1b/ansible_ansible.legacy.setup_payload.zip' <<<
41175 1727204633.41587: stdout chunk (state=3): >>># zipimport: zlib available <<<
41175 1727204633.41733: stdout chunk (state=3): >>># zipimport: zlib available <<<
41175 1727204633.41768: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<<
41175 1727204633.41771: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<<
41175 1727204633.41823: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<<
41175 1727204633.41898: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<<
41175 1727204633.41935: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a8b2210> <<<
41175 1727204633.41943: stdout chunk (state=3): >>>import '_typing' # <<<
41175 1727204633.42148: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a889100> <<<
41175 1727204633.42151: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a888260> <<<
41175 1727204633.42160: stdout chunk (state=3): >>># zipimport: zlib available <<<
41175 1727204633.42192: stdout chunk (state=3): >>>import 'ansible' # <<<
41175 1727204633.42199: stdout chunk (state=3): >>># zipimport: zlib available <<<
41175 1727204633.42215: stdout chunk (state=3): >>># zipimport: zlib available <<<
41175 1727204633.42239: stdout chunk (state=3): >>># zipimport: zlib available <<<
41175 1727204633.42251: stdout chunk (state=3): >>>import 'ansible.module_utils' # <<<
41175 1727204633.42261: stdout chunk (state=3): >>># zipimport: zlib available <<<
41175 1727204633.43842: stdout chunk (state=3): >>># zipimport: zlib available <<<
41175 1727204633.45129: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py <<<
41175 1727204633.45143: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a88b620> <<<
41175 1727204633.45162: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<<
41175 1727204633.45195: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<<
41175 1727204633.45201: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<<
41175 1727204633.45226: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<<
41175 1727204633.45265: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' <<<
41175 1727204633.45269: stdout chunk (state=3): >>>import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f802a8e5ca0> <<<
41175 1727204633.45302: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a8e5a30> <<<
41175 1727204633.45332: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a8e5340> <<<
41175 1727204633.45358: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<<
41175 1727204633.45366: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<<
41175 1727204633.45406: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a8e5790> <<<
41175 1727204633.45413: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a8b2ea0> import 'atexit' # <<<
41175 1727204633.45443: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f802a8e6a20> <<<
41175 1727204633.45474: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f802a8e6c60> <<<
41175 1727204633.45500: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<<
41175 1727204633.45539: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<<
41175 1727204633.45561: stdout chunk (state=3): >>>import '_locale' # <<<
41175 1727204633.45608: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a8e7170> <<<
41175 1727204633.45614: stdout chunk (state=3): >>>import 'pwd' # <<<
41175 1727204633.45632: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<<
41175 1727204633.45663: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<<
41175 1727204633.45702: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a748f20> <<<
41175 1727204633.45728: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f802a74ab40> <<<
41175 1727204633.45756: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<<
41175 1727204633.45768: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<<
41175 1727204633.45816: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a74b4a0> <<<
41175 1727204633.45826: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<<
41175 1727204633.45861: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<<
41175 1727204633.45877: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a74c680> <<<
41175 1727204633.45904: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<<
41175 1727204633.45933: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<<
41175 1727204633.45957: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<<
41175 1727204633.46024: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a74f140> <<<
41175 1727204633.46054: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' <<<
41175 1727204633.46077: stdout chunk (state=3): >>># extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f802a74f260> <<<
41175 1727204633.46083: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a74d400> <<<
41175 1727204633.46110: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<<
41175 1727204633.46139: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<<
41175 1727204633.46156: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<<
41175 1727204633.46181: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<<
41175 1727204633.46209: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<<
41175 1727204633.46233: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<<
41175 1727204633.46257: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a752f90> <<<
41175 1727204633.46262: stdout chunk (state=3): >>>import '_tokenize' # <<<
41175 1727204633.46332: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a751a60> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a7517c0> <<<
41175 1727204633.46361: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<<
41175 1727204633.46367: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<<
41175 1727204633.46441: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a753ef0> <<<
41175 1727204633.46471: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a74d910> <<<
41175 1727204633.46499: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f802a797110> <<<
41175 1727204633.46530: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py <<<
41175 1727204633.46536: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a7972c0> <<<
41175 1727204633.46561: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<<
41175 1727204633.46578: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<<
41175 1727204633.46599: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<<
41175 1727204633.46642: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f802a79ce90> <<<
41175 1727204633.46645: stdout chunk (state=3): >>>import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a79cc50> <<<
41175 1727204633.46662: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<<
41175 1727204633.46776: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<<
41175 1727204633.46826: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<<
41175 1727204633.46832: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f802a79f3e0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a79d580> <<<
41175 1727204633.46858: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<<
41175 1727204633.46903: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<<
41175 1727204633.46928: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<<
41175 1727204633.46943: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<<
41175 1727204633.46950: stdout chunk (state=3): >>>import '_string' # <<<
41175 1727204633.46999: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a7a6b10> <<<
41175 1727204633.47153: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a79f4a0> <<<
41175 1727204633.47232: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<<
41175 1727204633.47238: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f802a7a78f0> <<<
41175 1727204633.47265: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' <<<
41175 1727204633.47271: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f802a7a7770> <<<
41175 1727204633.47325: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' <<<
41175 1727204633.47329: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f802a7a7e60> <<<
41175 1727204633.47342: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a797590> <<<
41175 1727204633.47363: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<<
41175 1727204633.47391: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<<
41175 1727204633.47411: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<<
41175 1727204633.47444: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<<
41175 1727204633.47474: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<<
41175 1727204633.47480: stdout chunk (state=3): >>>import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f802a7ab5c0> <<<
41175 1727204633.47667: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<<
41175 1727204633.47675: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f802a7ac680> <<<
41175 1727204633.47698: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a7a9d30> <<<
41175 1727204633.47728: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' <<<
41175 1727204633.47738: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f802a7ab0e0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a7a9970> <<<
41175 1727204633.47759: stdout chunk (state=3): >>># zipimport: zlib available <<<
41175 1727204633.47772: stdout chunk (state=3): >>># zipimport: zlib available <<<
41175 1727204633.47783: stdout chunk (state=3): >>>import 'ansible.module_utils.compat' # <<<
41175 1727204633.47795: stdout chunk (state=3): >>># zipimport: zlib available <<<
41175 1727204633.47899: stdout chunk (state=3): >>># zipimport: zlib available <<<
41175 1727204633.48015: stdout chunk (state=3): >>># zipimport: zlib available <<<
41175 1727204633.48018: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # <<<
41175 1727204633.48042: stdout chunk (state=3): >>># zipimport: zlib available <<<
41175 1727204633.48066: stdout chunk (state=3): >>># zipimport: zlib available <<<
41175 1727204633.48070: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text' # <<<
41175 1727204633.48081: stdout chunk (state=3): >>># zipimport: zlib available <<<
41175 1727204633.48222: stdout chunk (state=3): >>># zipimport: zlib available <<<
41175 1727204633.48360: stdout chunk (state=3): >>># zipimport: zlib available <<<
41175 1727204633.49032: stdout chunk (state=3): >>># zipimport: zlib available <<<
41175 1727204633.49719: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<<
41175 1727204633.49734: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # <<<
41175 1727204633.49746: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.converters' # <<<
41175 1727204633.49758: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<<
41175 1727204633.49780: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<<
41175 1727204633.49838: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f802a634740> <<<
41175 1727204633.49949: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py <<<
41175 1727204633.49953: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<<
41175 1727204633.49971: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a635520> <<<
41175 1727204633.49990: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a7aef90> <<<
41175 1727204633.50036: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<<
41175 1727204633.50054: stdout chunk (state=3): >>># zipimport: zlib available <<<
41175 1727204633.50075:
stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.50102: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # <<< 41175 1727204633.50108: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.50288: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.50477: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 41175 1727204633.50493: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 41175 1727204633.50504: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a6354c0> <<< 41175 1727204633.50511: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.51073: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.51624: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.51707: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.51799: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 41175 1727204633.51806: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.51850: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.51896: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 41175 1727204633.51902: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.51985: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.52103: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 41175 1727204633.52120: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.52127: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 41175 1727204633.52152: stdout chunk (state=3): >>># zipimport: zlib available <<< 
41175 1727204633.52195: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.52236: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 41175 1727204633.52249: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.52528: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.52806: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 41175 1727204633.52870: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 41175 1727204633.52888: stdout chunk (state=3): >>>import '_ast' # <<< 41175 1727204633.52975: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a637c20> <<< 41175 1727204633.52988: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.53073: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.53152: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 41175 1727204633.53184: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 41175 1727204633.53202: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 41175 1727204633.53292: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 41175 1727204633.53420: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7f802a63e0c0> <<< 41175 1727204633.53470: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 41175 1727204633.53477: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f802a63ea20> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a636d80> <<< 41175 1727204633.53503: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.53545: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.53591: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 41175 1727204633.53598: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.53644: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.53692: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.53753: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.53826: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 41175 1727204633.53866: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 41175 1727204633.53957: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' <<< 41175 1727204633.53963: stdout chunk (state=3): >>># extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f802a63d790> <<< 
41175 1727204633.54004: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a63ec00> <<< 41175 1727204633.54030: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 41175 1727204633.54052: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.54117: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.54189: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.54214: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.54263: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 41175 1727204633.54291: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 41175 1727204633.54309: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 41175 1727204633.54334: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 41175 1727204633.54390: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 41175 1727204633.54410: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 41175 1727204633.54425: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 41175 1727204633.54489: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a6d6e40> <<< 41175 
1727204633.54532: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a648b90> <<< 41175 1727204633.54620: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a642c60> <<< 41175 1727204633.54624: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a642ab0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 41175 1727204633.54640: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.54663: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.54698: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 41175 1727204633.54754: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 41175 1727204633.54771: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.54788: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # <<< 41175 1727204633.54805: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.54865: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.54937: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.54951: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.54978: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.55025: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.55066: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.55107: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.55143: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 41175 1727204633.55158: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.55242: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 41175 1727204633.55319: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.55349: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.55386: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 41175 1727204633.55395: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.55594: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.55785: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.55825: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.55886: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 41175 1727204633.55912: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 41175 1727204633.55931: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 41175 1727204633.55947: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 41175 1727204633.55969: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 41175 1727204633.56000: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a6d9cd0> <<< 41175 1727204633.56020: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 41175 1727204633.56031: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 41175 1727204633.56050: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 41175 1727204633.56102: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 41175 1727204633.56126: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 41175 1727204633.56138: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' <<< 41175 1727204633.56143: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8029b4c4a0> <<< 41175 1727204633.56183: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 41175 1727204633.56195: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8029b4c7a0> <<< 41175 1727204633.56244: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a6b94f0> <<< 41175 1727204633.56264: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a6b87d0> <<< 41175 1727204633.56304: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a6d83b0> <<< 41175 1727204633.56320: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a6d8080> <<< 41175 1727204633.56332: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 41175 1727204633.56377: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 41175 1727204633.56404: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py <<< 41175 1727204633.56410: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 41175 1727204633.56437: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py <<< 41175 1727204633.56442: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 41175 1727204633.56476: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8029b4f770> <<< 41175 1727204633.56492: stdout chunk (state=3): >>>import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8029b4f050> <<< 41175 1727204633.56518: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8029b4f200> <<< 41175 1727204633.56531: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8029b4e480> <<< 41175 1727204633.56555: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 41175 1727204633.56668: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 41175 1727204633.56676: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8029b4f890> <<< 41175 1727204633.56694: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 41175 1727204633.56729: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 41175 1727204633.56762: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' <<< 41175 1727204633.56768: stdout chunk (state=3): >>># extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8029bb6330> <<< 41175 1727204633.56793: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8029b4e4e0> <<< 41175 1727204633.56821: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a6d9460> import 'ansible.module_utils.facts.timeout' # <<< 41175 1727204633.56845: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # <<< 41175 1727204633.56862: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.56883: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other' # <<< 41175 1727204633.56899: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 41175 1727204633.56957: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.57017: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 41175 1727204633.57034: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.57096: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.57138: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 41175 1727204633.57162: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.57175: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.57191: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system' # <<< 41175 1727204633.57194: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.57228: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.57258: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # <<< 41175 1727204633.57267: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.57326: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.57379: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # <<< 41175 1727204633.57392: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.57431: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.57481: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 41175 1727204633.57485: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.57551: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.57609: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.57676: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.57739: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' 
# <<< 41175 1727204633.57751: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.cmdline' # <<< 41175 1727204633.57757: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.58316: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.58820: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 41175 1727204633.58833: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.58891: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.58948: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.58985: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.59023: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # <<< 41175 1727204633.59040: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.59066: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.59100: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 41175 1727204633.59110: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.59172: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.59228: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 41175 1727204633.59252: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.59277: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.59313: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 41175 1727204633.59323: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.59355: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.59391: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 41175 1727204633.59397: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.59478: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.59572: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 41175 1727204633.59579: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 41175 1727204633.59609: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8029bb7e00> <<< 41175 1727204633.59626: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 41175 1727204633.59657: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 41175 1727204633.59784: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8029bb7050> <<< 41175 1727204633.59796: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.local' # <<< 41175 1727204633.59804: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.59885: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.59938: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 41175 1727204633.59955: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.60046: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.60150: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 41175 1727204633.60158: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.60235: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.60314: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 41175 1727204633.60323: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 41175 1727204633.60366: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.60420: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 41175 1727204633.60464: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 41175 1727204633.60538: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 41175 1727204633.60610: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8029be65d0> <<< 41175 1727204633.60821: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8029bd3200> import 'ansible.module_utils.facts.system.python' # <<< 41175 1727204633.60832: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.60899: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.60956: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 41175 1727204633.60959: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.61058: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.61145: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.61280: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.61436: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # <<< 41175 1727204633.61443: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available <<< 41175 1727204633.61492: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.61547: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 41175 1727204633.61556: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.61581: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.61635: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 41175 1727204633.61673: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 41175 1727204633.61693: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 41175 1727204633.61705: stdout chunk (state=3): >>>import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8029a01df0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8029be5dc0> import 'ansible.module_utils.facts.system.user' # <<< 41175 1727204633.61733: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.61740: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware' # <<< 41175 1727204633.61756: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.61800: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.61882: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available <<< 41175 1727204633.62033: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.62214: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 41175 1727204633.62221: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.62321: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.62467: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 41175 1727204633.62503: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.62535: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available <<< 41175 1727204633.62572: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.62608: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.62745: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.62925: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # <<< 41175 1727204633.62956: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available <<< 41175 1727204633.63054: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.63202: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 41175 1727204633.63231: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 41175 1727204633.63275: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.64075: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.64523: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # <<< 41175 1727204633.64529: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.64611: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.64747: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 41175 1727204633.64753: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.64852: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.64974: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 41175 
1727204633.64999: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.65223: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.65426: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available <<< 41175 1727204633.65437: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 41175 1727204633.65441: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.65447: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.65468: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 41175 1727204633.65480: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.65574: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.65705: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.65920: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.66136: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 41175 1727204633.66173: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.66192: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.66243: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 41175 1727204633.66248: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.66272: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.66293: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # <<< 41175 1727204633.66307: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.66381: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.66456: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.network.fc_wwn' # <<< 41175 1727204633.66485: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.66515: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.66541: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available <<< 41175 1727204633.66592: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.66659: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 41175 1727204633.66663: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.66729: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.66797: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 41175 1727204633.66804: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.67101: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.67400: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 41175 1727204633.67406: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.67471: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.67535: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 41175 1727204633.67542: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.67581: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.67622: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 41175 1727204633.67628: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.67663: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.67703: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 41175 1727204633.67712: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 
1727204633.67744: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.67782: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # <<< 41175 1727204633.67793: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.67880: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.67972: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 41175 1727204633.67994: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.68009: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual' # <<< 41175 1727204633.68021: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.68062: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.68113: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 41175 1727204633.68130: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.68140: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.68167: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.68217: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.68271: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.68346: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.68429: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # <<< 41175 1727204633.68438: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # <<< 41175 1727204633.68450: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.68512: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.68562: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 41175 
1727204633.68570: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.68797: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.69014: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 41175 1727204633.69028: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.69073: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.69130: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 41175 1727204633.69136: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.69187: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.69241: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 41175 1727204633.69250: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.69339: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.69429: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 41175 1727204633.69445: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.69538: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.69641: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 41175 1727204633.69727: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204633.70292: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py <<< 41175 1727204633.70302: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 41175 1727204633.70317: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 41175 1727204633.70334: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 41175 1727204633.70371: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' <<< 41175 1727204633.70383: stdout chunk (state=3): >>># extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8029a2b770> <<< 41175 1727204633.70387: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8029a2b320> <<< 41175 1727204633.70437: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8029a28470> <<< 41175 1727204633.85365: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py <<< 41175 1727204633.85392: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' <<< 41175 1727204633.85438: stdout chunk (state=3): >>>import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8029a70830> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py <<< 41175 1727204633.85455: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' <<< 41175 1727204633.85474: stdout chunk (state=3): >>>import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8029a71910> <<< 41175 1727204633.85523: stdout chunk 
(state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' <<< 41175 1727204633.85580: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' <<< 41175 1727204633.85584: stdout chunk (state=3): >>>import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8029a73b60> <<< 41175 1727204633.85605: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8029a72c90> <<< 41175 1727204633.85901: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 41175 1727204634.10229: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_hostnqn": "", "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", 
"ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec272ed147e29e35f2e68cd6465c5ec1", "ansible_lsb": {}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 50414 10.31.10.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 50414 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_loadavg": {"1m<<< 41175 1727204634.10285: stdout chunk (state=3): >>>": 0.91162109375, "5m": 0.8232421875, "15m": 0.50830078125}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "03", "second": "53", "epoch": "1727204633", "epoch_int": "1727204633", "date": "2024-09-24", "time": "15:03:53", "iso8601_micro": "2024-09-24T19:03:53.706148Z", "iso8601": "2024-09-24T19:03:53Z", "iso8601_basic": "20240924T150353706148", "iso8601_basic_short": "20240924T150353", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system_capabilities_enforced": 
"False", "ansible_system_capabilities": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_is_chroot": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2844, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 873, "free": 2844}, "nocache": {"free": 3474, "used": 243}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec272ed1-47e2-9e35-f2e6-8cd6465c5ec1", "ansible_product_uuid": "ec272ed1-47e2-9e35-f2e6-8cd6465c5ec1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", 
"holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 1138, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251148922880, "block_size": 4096, "block_total": 64479564, "block_available": 61315655, "block_used": 3163909, "inode_total": 16384000, "inode_available": 16302071, "inode_used": 81929, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBAI5YZQ7OH6eqgmanrwxkUl16pMvE2q26X32NofYRKBzF04m84VIsiCBP80rN+sGEKnRhTwlxJwcSfAyscmxkynk8ozeR0SaMEECkbOjee1DqGR1yz8VSKEIk2gZ+ImYscF6c32jGvz1w/gz9baswEs+v92Ljqv3+V3s8foVkwWM1AAAAFQDApo03iAyJzp9y7AillVl9LpN8rwAAAIBNHNvfLLH/rvWMdavYWGiljarx5Z8cDKFv4QiliuY2AenrQ5mjBN3ZJZuDpmwC9vuoPM+TWxp9pbrnVJy4VM6iS8c/Lr9I982fUD4neMvJEywdnYtsRhezGMCk57/Npw91h6EKhcAYiaFF53jl540WIjTvu2bEA8Hgb11YGH+isAAAAIAkremps+61DEFeDWQjRHbf8fZzhmpUYduU+sHRW5usa/1cOOeeN/8XBHfMST6TPedAY/6t7Oxda9D2mq6mo2Rl9arSQWcBypqwvzRiz0LGnRnElGtXKJALy6vYKG7xi+29ZmqlBvD14cB7/wSZqZP9MkRj3+QzQJLvNnuGRyLguA==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDBj8PEtqglWtlJ3r3hgP2TELjSd8JOOpjIitLlWjKdUao5ePB6PWTf9MZV0rLZr0re7hAS1EWeexARYQakyETmyOoPmRCaD5vvrfN3AJJ6I+O2EhApLpYrEORJbTfrme6AoCGmxQG8tR7j3YpVOvePZ65ka7FDUWnNLI0DWpyDURAKmvOxtiOcYazpmB7GJ/5ycpEAV7KGp7tEQ9MNIAbSaYTBXVBNa5V2HyEmcabs+/Qy/jp8OWy+Tl3uCUV0SmFplVCKib9Kp3eEMZd5udXsYnmUYtLNMJQkQOzTdol5AozustkdBnasVn/RSnQpWQMBrrUQMxchNOb8FDAuH6AONEVJl9mHY6mk3zfkkyPZE6sIrMIj0B48xTWzMIjC+N9SN7DRRUWzjYIqkL5fsYu0fkkGuZeNvyJRlv8h7oFWA7YtvNHdNYf41mkXryERg8V3zI0aZcmQul6XTOxywwd4b5sudMIng09hfyPOKtnYi6DIN2h5FxOWlvBEbLlcd2U=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPUvqdp1GSRMDwSqfOZO1hLGpDfzy51B9cIhTK2AWy7qlUXPaSlJ0jc31uj+CW3SnUW36VSKRHdj9R9hJev9Zic=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFL7RdA+aCgUcBhcJBLwti3mnwduhYXxSw8RlI3Cvebm", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_iscsi_iqn": "", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": 
"(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_fips": false, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_local": {}, "ansible_fibre_channel_wwn": [], "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:5e:c8:16:36:1d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.90", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::37d3:4e93:30d:de94", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": 
"off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", 
"tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], 
"hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.90", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:5e:c8:16:36:1d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.90"], "ansible_all_ipv6_addresses": ["fe80::37d3:4e93:30d:de94"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::37d3:4e93:30d:de94"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 41175 1727204634.10915: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 41175 1727204634.10970: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib <<< 41175 1727204634.11013: stdout chunk (state=3): >>># cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # 
cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy 
_weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing <<< 41175 1727204634.11067: stdout chunk (state=3): >>># cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token <<< 41175 1727204634.11096: stdout chunk (state=3): >>># cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing 
systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy 
ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux <<< 41175 1727204634.11137: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 41175 1727204634.11212: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # 
cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing 
ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing 
ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other <<< 41175 1727204634.11244: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy 
ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy 
ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 41175 1727204634.11604: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 41175 1727204634.11646: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma <<< 41175 1727204634.11680: stdout chunk (state=3): >>># destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 41175 1727204634.11732: stdout chunk (state=3): >>># destroy ntpath <<< 41175 1727204634.11787: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json <<< 41175 1727204634.11808: stdout chunk (state=3): >>># destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess <<< 41175 1727204634.11831: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 41175 1727204634.11875: stdout chunk (state=3): >>># destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil <<< 41175 1727204634.11886: stdout chunk (state=3): >>># destroy 
distro # destroy distro.distro # destroy argparse # destroy logging <<< 41175 1727204634.11945: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle <<< 41175 1727204634.11979: stdout chunk (state=3): >>># destroy _pickle # destroy queue # destroy _heapq # destroy _queue <<< 41175 1727204634.12047: stdout chunk (state=3): >>># destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl <<< 41175 1727204634.12050: stdout chunk (state=3): >>># destroy datetime # destroy subprocess # destroy base64 <<< 41175 1727204634.12092: stdout chunk (state=3): >>># destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd <<< 41175 1727204634.12115: stdout chunk (state=3): >>># destroy termios # destroy json <<< 41175 1727204634.12168: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout <<< 41175 1727204634.12179: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection <<< 41175 1727204634.12266: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc <<< 41175 1727204634.12307: 
stdout chunk (state=3): >>># cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 <<< 41175 1727204634.12387: stdout chunk (state=3): >>># cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler <<< 41175 1727204634.12456: stdout chunk (state=3): >>># destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc <<< 41175 1727204634.12466: stdout chunk (state=3): >>># destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping 
_codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp <<< 41175 1727204634.12499: stdout chunk (state=3): >>># cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 41175 1727204634.12654: stdout chunk (state=3): >>># destroy sys.monitoring <<< 41175 1727204634.12688: stdout chunk (state=3): >>># destroy _socket # destroy _collections <<< 41175 1727204634.12737: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser <<< 41175 1727204634.12751: stdout chunk (state=3): >>># destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 41175 1727204634.12798: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize <<< 41175 1727204634.12812: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 41175 1727204634.12849: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 41175 1727204634.12960: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # 
destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect <<< 41175 1727204634.12988: stdout chunk (state=3): >>># destroy time # destroy _random # destroy _weakref <<< 41175 1727204634.13041: stdout chunk (state=3): >>># destroy _operator # destroy _sha2 <<< 41175 1727204634.13079: stdout chunk (state=3): >>># destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread <<< 41175 1727204634.13094: stdout chunk (state=3): >>># clear sys.audit hooks <<< 41175 1727204634.13780: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 41175 1727204634.13784: stdout chunk (state=3): >>><<< 41175 1727204634.13786: stderr chunk (state=3): >>><<< 41175 1727204634.13828: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802adb44d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802ad83ad0> # 
/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802adb6a20> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802ab650a0> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802ab65fd0> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802aba3dd0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802aba3fe0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802abdb800> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f802abdbe90> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802abbbaa0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802abb91c0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802aba0f80> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802abff6e0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802abfe300> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802abba1b0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802abfcbf0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802ac30710> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802aba0200> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches 
/usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f802ac30bc0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802ac30a70> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f802ac30e60> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802ab9ed20> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802ac31520> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802ac311f0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802ac32420> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches 
/usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802ac4c650> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f802ac4dd90> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802ac4ec90> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f802ac4f2f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802ac4e1e0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f802ac4fd70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802ac4f4a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802ac32480> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f802a95fce0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f802a9887d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a988530> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f802a988800> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed 
from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f802a9889e0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a95de80> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a98a0f0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a988d70> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802ac32b70> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a9b24b0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a9ce600> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py 
# code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802aa033b0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802aa29b20> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802aa034d0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a9cf290> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a8484d0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a9cd640> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a98b050> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f802a848770> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_g2eyos1b/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # 
code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a8b2210> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a889100> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a888260> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a88b620> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from 
'/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f802a8e5ca0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a8e5a30> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a8e5340> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a8e5790> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a8b2ea0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f802a8e6a20> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f802a8e6c60> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a8e7170> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a748f20> # extension module 'select' loaded from 
'/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f802a74ab40> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a74b4a0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a74c680> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a74f140> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f802a74f260> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a74d400> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc 
matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a752f90> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a751a60> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a7517c0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a753ef0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a74d910> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f802a797110> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a7972c0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from 
'/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f802a79ce90> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a79cc50> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f802a79f3e0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a79d580> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a7a6b10> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a79f4a0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # 
extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f802a7a78f0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f802a7a7770> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f802a7a7e60> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a797590> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f802a7ab5c0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from 
'/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f802a7ac680> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a7a9d30> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f802a7ab0e0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a7a9970> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f802a634740> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches 
/usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a635520> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a7aef90> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a6354c0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a637c20> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches 
/usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f802a63e0c0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f802a63ea20> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a636d80> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f802a63d790> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a63ec00> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available 
# /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a6d6e40> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a648b90> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a642c60> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a642ab0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib 
available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a6d9cd0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8029b4c4a0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8029b4c7a0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f802a6b94f0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a6b87d0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a6d83b0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a6d8080> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8029b4f770> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8029b4f050> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8029b4f200> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8029b4e480> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f8029b4f890> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8029bb6330> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8029b4e4e0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f802a6d9460> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8029bb7e00> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8029bb7050> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from 
'/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8029be65d0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8029bd3200> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8029a01df0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8029be5dc0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib 
available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 
'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8029a2b770> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8029a2b320> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8029a28470> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8029a70830> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8029a71910> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8029a73b60> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8029a72c90> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_hostnqn": "", "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec272ed147e29e35f2e68cd6465c5ec1", "ansible_lsb": {}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": 
"/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 50414 10.31.10.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 50414 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_loadavg": {"1m": 0.91162109375, "5m": 0.8232421875, "15m": 0.50830078125}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "03", "second": "53", "epoch": "1727204633", "epoch_int": "1727204633", "date": "2024-09-24", "time": "15:03:53", "iso8601_micro": "2024-09-24T19:03:53.706148Z", "iso8601": "2024-09-24T19:03:53Z", "iso8601_basic": "20240924T150353706148", "iso8601_basic_short": "20240924T150353", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_is_chroot": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 
2844, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 873, "free": 2844}, "nocache": {"free": 3474, "used": 243}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec272ed1-47e2-9e35-f2e6-8cd6465c5ec1", "ansible_product_uuid": "ec272ed1-47e2-9e35-f2e6-8cd6465c5ec1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": 
"", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 1138, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251148922880, "block_size": 4096, "block_total": 64479564, "block_available": 61315655, "block_used": 3163909, "inode_total": 16384000, "inode_available": 16302071, "inode_used": 81929, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAI5YZQ7OH6eqgmanrwxkUl16pMvE2q26X32NofYRKBzF04m84VIsiCBP80rN+sGEKnRhTwlxJwcSfAyscmxkynk8ozeR0SaMEECkbOjee1DqGR1yz8VSKEIk2gZ+ImYscF6c32jGvz1w/gz9baswEs+v92Ljqv3+V3s8foVkwWM1AAAAFQDApo03iAyJzp9y7AillVl9LpN8rwAAAIBNHNvfLLH/rvWMdavYWGiljarx5Z8cDKFv4QiliuY2AenrQ5mjBN3ZJZuDpmwC9vuoPM+TWxp9pbrnVJy4VM6iS8c/Lr9I982fUD4neMvJEywdnYtsRhezGMCk57/Npw91h6EKhcAYiaFF53jl540WIjTvu2bEA8Hgb11YGH+isAAAAIAkremps+61DEFeDWQjRHbf8fZzhmpUYduU+sHRW5usa/1cOOeeN/8XBHfMST6TPedAY/6t7Oxda9D2mq6mo2Rl9arSQWcBypqwvzRiz0LGnRnElGtXKJALy6vYKG7xi+29ZmqlBvD14cB7/wSZqZP9MkRj3+QzQJLvNnuGRyLguA==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQDBj8PEtqglWtlJ3r3hgP2TELjSd8JOOpjIitLlWjKdUao5ePB6PWTf9MZV0rLZr0re7hAS1EWeexARYQakyETmyOoPmRCaD5vvrfN3AJJ6I+O2EhApLpYrEORJbTfrme6AoCGmxQG8tR7j3YpVOvePZ65ka7FDUWnNLI0DWpyDURAKmvOxtiOcYazpmB7GJ/5ycpEAV7KGp7tEQ9MNIAbSaYTBXVBNa5V2HyEmcabs+/Qy/jp8OWy+Tl3uCUV0SmFplVCKib9Kp3eEMZd5udXsYnmUYtLNMJQkQOzTdol5AozustkdBnasVn/RSnQpWQMBrrUQMxchNOb8FDAuH6AONEVJl9mHY6mk3zfkkyPZE6sIrMIj0B48xTWzMIjC+N9SN7DRRUWzjYIqkL5fsYu0fkkGuZeNvyJRlv8h7oFWA7YtvNHdNYf41mkXryERg8V3zI0aZcmQul6XTOxywwd4b5sudMIng09hfyPOKtnYi6DIN2h5FxOWlvBEbLlcd2U=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPUvqdp1GSRMDwSqfOZO1hLGpDfzy51B9cIhTK2AWy7qlUXPaSlJ0jc31uj+CW3SnUW36VSKRHdj9R9hJev9Zic=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFL7RdA+aCgUcBhcJBLwti3mnwduhYXxSw8RlI3Cvebm", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_iscsi_iqn": "", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_fips": false, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_local": {}, 
"ansible_fibre_channel_wwn": [], "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:5e:c8:16:36:1d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.90", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::37d3:4e93:30d:de94", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off 
[fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off 
[fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.90", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:5e:c8:16:36:1d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.90"], "ansible_all_ipv6_addresses": ["fe80::37d3:4e93:30d:de94"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::37d3:4e93:30d:de94"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], 
"module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum 
# cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] 
removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] 
removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing 
ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing 
ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] 
removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other 
# destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy 
ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy 
json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping 
systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy 
selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. [WARNING]: Module invocation had junk after the JSON data:
[WARNING]: Platform linux on host managed-node3 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html for more information. 41175 1727204634.15635: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204632.910493-41189-240074217015977/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41175 1727204634.15645: _low_level_execute_command(): starting 41175 1727204634.15648: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204632.910493-41189-240074217015977/ > /dev/null 2>&1 && sleep 0' 41175 1727204634.16274: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 1727204634.16304: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204634.16323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204634.16342: stderr chunk (state=3):
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204634.16412: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204634.16472: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204634.16503: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204634.16551: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204634.16596: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204634.18646: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204634.18650: stdout chunk (state=3): >>><<< 41175 1727204634.18652: stderr chunk (state=3): >>><<< 41175 1727204634.18695: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204634.18700: handler run complete 41175 1727204634.18931: variable 'ansible_facts' from source: unknown 41175 1727204634.19108: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204634.19740: variable 'ansible_facts' from source: unknown 41175 1727204634.19815: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204634.20068: attempt loop complete, returning result 41175 1727204634.20080: _execute() done 41175 1727204634.20090: dumping result to json 41175 1727204634.20139: done dumping result, returning 41175 1727204634.20154: done running TaskExecutor() for managed-node3/TASK: Gathering Facts [12b410aa-8751-f070-39c4-0000000000f5] 41175 1727204634.20165: sending task result for task 12b410aa-8751-f070-39c4-0000000000f5 ok: [managed-node3] 41175 1727204634.21568: no more pending results, returning what we have 41175 1727204634.21572: results queue empty 41175 1727204634.21573: checking for any_errors_fatal 41175 1727204634.21575: done checking for any_errors_fatal 41175 1727204634.21576: checking for max_fail_percentage 41175 1727204634.21578: done checking for max_fail_percentage 41175 1727204634.21579: checking to see if all hosts have failed and the running result is not ok 41175 
1727204634.21580: done checking to see if all hosts have failed 41175 1727204634.21581: getting the remaining hosts for this loop 41175 1727204634.21584: done getting the remaining hosts for this loop 41175 1727204634.21588: getting the next task for host managed-node3 41175 1727204634.21596: done getting next task for host managed-node3 41175 1727204634.21599: ^ task is: TASK: meta (flush_handlers) 41175 1727204634.21605: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41175 1727204634.21614: getting variables 41175 1727204634.21616: in VariableManager get_vars() 41175 1727204634.21643: Calling all_inventory to load vars for managed-node3 41175 1727204634.21646: Calling groups_inventory to load vars for managed-node3 41175 1727204634.21650: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204634.21657: done sending task result for task 12b410aa-8751-f070-39c4-0000000000f5 41175 1727204634.21660: WORKER PROCESS EXITING 41175 1727204634.21671: Calling all_plugins_play to load vars for managed-node3 41175 1727204634.21674: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204634.21679: Calling groups_plugins_play to load vars for managed-node3 41175 1727204634.21974: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204634.22320: done with get_vars() 41175 1727204634.22333: done getting variables 41175 1727204634.22426: in VariableManager get_vars() 41175 1727204634.22437: Calling all_inventory to load vars for managed-node3 41175 1727204634.22440: Calling groups_inventory to load vars for managed-node3 41175 1727204634.22443: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204634.22449: 
Calling all_plugins_play to load vars for managed-node3 41175 1727204634.22452: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204634.22456: Calling groups_plugins_play to load vars for managed-node3 41175 1727204634.22731: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204634.23072: done with get_vars() 41175 1727204634.23091: done queuing things up, now waiting for results queue to drain 41175 1727204634.23094: results queue empty 41175 1727204634.23095: checking for any_errors_fatal 41175 1727204634.23098: done checking for any_errors_fatal 41175 1727204634.23099: checking for max_fail_percentage 41175 1727204634.23101: done checking for max_fail_percentage 41175 1727204634.23107: checking to see if all hosts have failed and the running result is not ok 41175 1727204634.23108: done checking to see if all hosts have failed 41175 1727204634.23109: getting the remaining hosts for this loop 41175 1727204634.23110: done getting the remaining hosts for this loop 41175 1727204634.23113: getting the next task for host managed-node3 41175 1727204634.23121: done getting next task for host managed-node3 41175 1727204634.23124: ^ task is: TASK: Include the task 'el_repo_setup.yml' 41175 1727204634.23126: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204634.23134: getting variables 41175 1727204634.23135: in VariableManager get_vars() 41175 1727204634.23152: Calling all_inventory to load vars for managed-node3 41175 1727204634.23155: Calling groups_inventory to load vars for managed-node3 41175 1727204634.23158: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204634.23164: Calling all_plugins_play to load vars for managed-node3 41175 1727204634.23167: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204634.23171: Calling groups_plugins_play to load vars for managed-node3 41175 1727204634.23411: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204634.23756: done with get_vars() 41175 1727204634.23766: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tests_route_table_nm.yml:11 Tuesday 24 September 2024 15:03:54 -0400 (0:00:01.367) 0:00:01.377 ***** 41175 1727204634.23871: entering _queue_task() for managed-node3/include_tasks 41175 1727204634.23874: Creating lock for include_tasks 41175 1727204634.24344: worker is 1 (out of 1 available) 41175 1727204634.24358: exiting _queue_task() for managed-node3/include_tasks 41175 1727204634.24368: done queuing things up, now waiting for results queue to drain 41175 1727204634.24371: waiting for pending results... 
41175 1727204634.24610: running TaskExecutor() for managed-node3/TASK: Include the task 'el_repo_setup.yml' 41175 1727204634.24737: in run() - task 12b410aa-8751-f070-39c4-000000000006 41175 1727204634.24762: variable 'ansible_search_path' from source: unknown 41175 1727204634.24822: calling self._execute() 41175 1727204634.24920: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204634.24936: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204634.24954: variable 'omit' from source: magic vars 41175 1727204634.25092: _execute() done 41175 1727204634.25111: dumping result to json 41175 1727204634.25122: done dumping result, returning 41175 1727204634.25134: done running TaskExecutor() for managed-node3/TASK: Include the task 'el_repo_setup.yml' [12b410aa-8751-f070-39c4-000000000006] 41175 1727204634.25148: sending task result for task 12b410aa-8751-f070-39c4-000000000006 41175 1727204634.25294: done sending task result for task 12b410aa-8751-f070-39c4-000000000006 41175 1727204634.25297: WORKER PROCESS EXITING 41175 1727204634.25463: no more pending results, returning what we have 41175 1727204634.25469: in VariableManager get_vars() 41175 1727204634.25506: Calling all_inventory to load vars for managed-node3 41175 1727204634.25511: Calling groups_inventory to load vars for managed-node3 41175 1727204634.25515: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204634.25536: Calling all_plugins_play to load vars for managed-node3 41175 1727204634.25540: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204634.25544: Calling groups_plugins_play to load vars for managed-node3 41175 1727204634.26022: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204634.26364: done with get_vars() 41175 1727204634.26373: variable 'ansible_search_path' from source: unknown 41175 1727204634.26388: we have 
included files to process 41175 1727204634.26392: generating all_blocks data 41175 1727204634.26393: done generating all_blocks data 41175 1727204634.26395: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 41175 1727204634.26396: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 41175 1727204634.26399: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 41175 1727204634.27364: in VariableManager get_vars() 41175 1727204634.27383: done with get_vars() 41175 1727204634.27404: done processing included file 41175 1727204634.27406: iterating over new_blocks loaded from include file 41175 1727204634.27408: in VariableManager get_vars() 41175 1727204634.27423: done with get_vars() 41175 1727204634.27425: filtering new block on tags 41175 1727204634.27443: done filtering new block on tags 41175 1727204634.27447: in VariableManager get_vars() 41175 1727204634.27464: done with get_vars() 41175 1727204634.27466: filtering new block on tags 41175 1727204634.27486: done filtering new block on tags 41175 1727204634.27491: in VariableManager get_vars() 41175 1727204634.27505: done with get_vars() 41175 1727204634.27507: filtering new block on tags 41175 1727204634.27531: done filtering new block on tags 41175 1727204634.27534: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed-node3 41175 1727204634.27540: extending task lists for all hosts with included blocks 41175 1727204634.27643: done extending task lists 41175 1727204634.27645: done processing included files 41175 1727204634.27646: results queue empty 41175 1727204634.27647: checking for any_errors_fatal 41175 1727204634.27649: done checking for any_errors_fatal 41175 
1727204634.27650: checking for max_fail_percentage 41175 1727204634.27651: done checking for max_fail_percentage 41175 1727204634.27652: checking to see if all hosts have failed and the running result is not ok 41175 1727204634.27653: done checking to see if all hosts have failed 41175 1727204634.27654: getting the remaining hosts for this loop 41175 1727204634.27655: done getting the remaining hosts for this loop 41175 1727204634.27658: getting the next task for host managed-node3 41175 1727204634.27663: done getting next task for host managed-node3 41175 1727204634.27665: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 41175 1727204634.27668: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204634.27670: getting variables 41175 1727204634.27672: in VariableManager get_vars() 41175 1727204634.27686: Calling all_inventory to load vars for managed-node3 41175 1727204634.27691: Calling groups_inventory to load vars for managed-node3 41175 1727204634.27694: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204634.27700: Calling all_plugins_play to load vars for managed-node3 41175 1727204634.27703: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204634.27707: Calling groups_plugins_play to load vars for managed-node3 41175 1727204634.27952: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204634.28294: done with get_vars() 41175 1727204634.28304: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Tuesday 24 September 2024 15:03:54 -0400 (0:00:00.045) 0:00:01.422 ***** 41175 1727204634.28398: entering _queue_task() for managed-node3/setup 41175 1727204634.28911: worker is 1 (out of 1 available) 41175 1727204634.28925: exiting _queue_task() for managed-node3/setup 41175 1727204634.28935: done queuing things up, now waiting for results queue to drain 41175 1727204634.28937: waiting for pending results... 
41175 1727204634.29215: running TaskExecutor() for managed-node3/TASK: Gather the minimum subset of ansible_facts required by the network role test 41175 1727204634.29223: in run() - task 12b410aa-8751-f070-39c4-000000000106 41175 1727204634.29227: variable 'ansible_search_path' from source: unknown 41175 1727204634.29230: variable 'ansible_search_path' from source: unknown 41175 1727204634.29238: calling self._execute() 41175 1727204634.29337: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204634.29351: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204634.29368: variable 'omit' from source: magic vars 41175 1727204634.30103: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41175 1727204634.32897: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41175 1727204634.32967: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41175 1727204634.33032: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41175 1727204634.33082: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41175 1727204634.33127: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41175 1727204634.33295: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204634.33308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204634.33361: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204634.33423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204634.33465: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204634.33707: variable 'ansible_facts' from source: unknown 41175 1727204634.33898: variable 'network_test_required_facts' from source: task vars 41175 1727204634.33906: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 41175 1727204634.33910: variable 'omit' from source: magic vars 41175 1727204634.33951: variable 'omit' from source: magic vars 41175 1727204634.34020: variable 'omit' from source: magic vars 41175 1727204634.34195: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41175 1727204634.34199: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41175 1727204634.34201: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41175 1727204634.34204: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204634.34206: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204634.34209: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41175 1727204634.34211: variable 'ansible_host' from source: host vars for 
'managed-node3' 41175 1727204634.34213: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204634.34354: Set connection var ansible_shell_executable to /bin/sh 41175 1727204634.34364: Set connection var ansible_shell_type to sh 41175 1727204634.34377: Set connection var ansible_pipelining to False 41175 1727204634.34395: Set connection var ansible_timeout to 10 41175 1727204634.34408: Set connection var ansible_connection to ssh 41175 1727204634.34440: Set connection var ansible_module_compression to ZIP_DEFLATED 41175 1727204634.34470: variable 'ansible_shell_executable' from source: unknown 41175 1727204634.34550: variable 'ansible_connection' from source: unknown 41175 1727204634.34558: variable 'ansible_module_compression' from source: unknown 41175 1727204634.34560: variable 'ansible_shell_type' from source: unknown 41175 1727204634.34563: variable 'ansible_shell_executable' from source: unknown 41175 1727204634.34565: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204634.34567: variable 'ansible_pipelining' from source: unknown 41175 1727204634.34569: variable 'ansible_timeout' from source: unknown 41175 1727204634.34571: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204634.34735: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 41175 1727204634.34755: variable 'omit' from source: magic vars 41175 1727204634.34779: starting attempt loop 41175 1727204634.34787: running the handler 41175 1727204634.34809: _low_level_execute_command(): starting 41175 1727204634.34826: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41175 1727204634.35627: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 
1727204634.35657: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204634.35673: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204634.35773: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204634.35815: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204634.35837: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204634.35876: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204634.35949: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204634.37719: stdout chunk (state=3): >>>/root <<< 41175 1727204634.38059: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204634.38063: stdout chunk (state=3): >>><<< 41175 1727204634.38065: stderr chunk (state=3): >>><<< 41175 1727204634.38069: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204634.38079: _low_level_execute_command(): starting 41175 1727204634.38082: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204634.3794992-41295-21111145522117 `" && echo ansible-tmp-1727204634.3794992-41295-21111145522117="` echo /root/.ansible/tmp/ansible-tmp-1727204634.3794992-41295-21111145522117 `" ) && sleep 0' 41175 1727204634.38681: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 1727204634.38707: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204634.38726: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204634.38757: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204634.38777: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 41175 1727204634.38875: stderr chunk 
(state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204634.38906: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204634.38928: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204634.38952: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204634.39033: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204634.41051: stdout chunk (state=3): >>>ansible-tmp-1727204634.3794992-41295-21111145522117=/root/.ansible/tmp/ansible-tmp-1727204634.3794992-41295-21111145522117 <<< 41175 1727204634.41181: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204634.41285: stderr chunk (state=3): >>><<< 41175 1727204634.41296: stdout chunk (state=3): >>><<< 41175 1727204634.41321: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204634.3794992-41295-21111145522117=/root/.ansible/tmp/ansible-tmp-1727204634.3794992-41295-21111145522117 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204634.41501: variable 'ansible_module_compression' from source: unknown 41175 1727204634.41505: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-41175qa2aqh90/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 41175 1727204634.41507: variable 'ansible_facts' from source: unknown 41175 1727204634.41734: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204634.3794992-41295-21111145522117/AnsiballZ_setup.py 41175 1727204634.41963: Sending initial data 41175 1727204634.41974: Sent initial data (153 bytes) 41175 1727204634.42683: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 1727204634.42723: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204634.42737: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204634.42835: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204634.42880: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204634.42906: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204634.42980: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204634.44613: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41175 1727204634.44692: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41175 1727204634.44741: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41175qa2aqh90/tmppgtbi3rj /root/.ansible/tmp/ansible-tmp-1727204634.3794992-41295-21111145522117/AnsiballZ_setup.py <<< 41175 1727204634.44746: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204634.3794992-41295-21111145522117/AnsiballZ_setup.py" <<< 41175 1727204634.44777: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-41175qa2aqh90/tmppgtbi3rj" to remote "/root/.ansible/tmp/ansible-tmp-1727204634.3794992-41295-21111145522117/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204634.3794992-41295-21111145522117/AnsiballZ_setup.py" <<< 41175 1727204634.47230: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204634.47417: stderr chunk (state=3): >>><<< 41175 1727204634.47420: stdout chunk (state=3): >>><<< 41175 1727204634.47423: done transferring module to remote 41175 1727204634.47425: _low_level_execute_command(): starting 41175 1727204634.47427: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204634.3794992-41295-21111145522117/ /root/.ansible/tmp/ansible-tmp-1727204634.3794992-41295-21111145522117/AnsiballZ_setup.py && sleep 0' 41175 1727204634.48158: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 1727204634.48174: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204634.48188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204634.48269: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204634.48331: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204634.48375: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204634.48439: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204634.50401: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204634.50510: stderr chunk (state=3): >>><<< 41175 1727204634.50520: stdout chunk (state=3): >>><<< 41175 1727204634.50543: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204634.50552: _low_level_execute_command(): starting 41175 1727204634.50562: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204634.3794992-41295-21111145522117/AnsiballZ_setup.py && sleep 0' 41175 1727204634.51255: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 1727204634.51307: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204634.51395: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204634.51432: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204634.51466: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204634.51525: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 41175 1727204634.53810: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 41175 1727204634.53867: stdout chunk (state=3): >>>import _imp # builtin <<< 41175 1727204634.53870: stdout chunk (state=3): >>>import '_thread' # <<< 41175 1727204634.53895: stdout chunk (state=3): >>>import '_warnings' # import '_weakref' # <<< 41175 1727204634.53957: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 41175 1727204634.53985: stdout chunk (state=3): >>>import 'posix' # <<< 41175 1727204634.54025: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 41175 1727204634.54058: stdout chunk (state=3): >>>import 'time' # <<< 41175 1727204634.54080: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 41175 1727204634.54124: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 41175 1727204634.54141: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # <<< 41175 1727204634.54168: stdout chunk (state=3): >>>import 'codecs' # <<< 41175 1727204634.54232: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 41175 1727204634.54235: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 41175 1727204634.54276: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4ea2d44d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4ea2a3ad0> <<< 41175 1727204634.54325: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from 
'/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 41175 1727204634.54330: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4ea2d6a20> <<< 41175 1727204634.54356: stdout chunk (state=3): >>>import '_signal' # import '_abc' # import 'abc' # <<< 41175 1727204634.54370: stdout chunk (state=3): >>>import 'io' # <<< 41175 1727204634.54403: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 41175 1727204634.54493: stdout chunk (state=3): >>>import '_collections_abc' # <<< 41175 1727204634.54542: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 41175 1727204634.54591: stdout chunk (state=3): >>>import 'os' # <<< 41175 1727204634.54621: stdout chunk (state=3): >>>import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' <<< 41175 1727204634.54655: stdout chunk (state=3): >>>Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 41175 1727204634.54677: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 41175 1727204634.54693: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4ea0c50a0> <<< 41175 1727204634.54758: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 41175 1727204634.54787: stdout chunk (state=3): >>>import 
'_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4ea0c5fd0> <<< 41175 1727204634.54807: stdout chunk (state=3): >>>import 'site' # <<< 41175 1727204634.54832: stdout chunk (state=3): >>>Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 41175 1727204634.55254: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 41175 1727204634.55277: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 41175 1727204634.55312: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 41175 1727204634.55328: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 41175 1727204634.55378: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 41175 1727204634.55381: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 41175 1727204634.55449: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4ea103e90> <<< 41175 1727204634.55452: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 41175 1727204634.55500: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 41175 1727204634.55505: stdout chunk (state=3): >>>import '_operator' # import 'operator' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fd4ea103f50> <<< 41175 1727204634.55540: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 41175 1727204634.55547: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 41175 1727204634.55569: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 41175 1727204634.55619: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 41175 1727204634.55671: stdout chunk (state=3): >>>import 'itertools' # <<< 41175 1727204634.55675: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' <<< 41175 1727204634.55727: stdout chunk (state=3): >>>import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4ea13b860> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4ea13bef0> <<< 41175 1727204634.55730: stdout chunk (state=3): >>>import '_collections' # <<< 41175 1727204634.55783: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4ea11bb60> <<< 41175 1727204634.55796: stdout chunk (state=3): >>>import '_functools' # <<< 41175 1727204634.55826: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4ea119280> <<< 41175 1727204634.55943: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fd4ea101040> <<< 41175 1727204634.55974: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 41175 1727204634.55977: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 41175 1727204634.56027: stdout chunk (state=3): >>>import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 41175 1727204634.56031: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 41175 1727204634.56060: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 41175 1727204634.56099: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4ea15f740> <<< 41175 1727204634.56137: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4ea15e360> <<< 41175 1727204634.56156: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4ea11a270> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4ea102f30> <<< 41175 1727204634.56214: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' <<< 41175 1727204634.56244: stdout chunk (state=3): >>>import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fd4ea190740> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4ea1002c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 41175 1727204634.56285: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 41175 1727204634.56312: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd4ea190bf0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4ea190aa0> <<< 41175 1727204634.56355: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd4ea190e30> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4ea0fede0> <<< 41175 1727204634.56388: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 41175 1727204634.56402: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 41175 1727204634.56441: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 41175 1727204634.56491: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4ea191520> import 
'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4ea1911f0> import 'importlib.machinery' # <<< 41175 1727204634.56495: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py <<< 41175 1727204634.56529: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4ea192420> <<< 41175 1727204634.56547: stdout chunk (state=3): >>>import 'importlib.util' # import 'runpy' # <<< 41175 1727204634.56567: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 41175 1727204634.56613: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 41175 1727204634.56642: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4ea1ac650> <<< 41175 1727204634.56697: stdout chunk (state=3): >>>import 'errno' # <<< 41175 1727204634.56701: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd4ea1add60> <<< 41175 1727204634.56753: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 41175 1727204634.56757: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 41175 1727204634.56827: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4ea1aec60> <<< 41175 1727204634.56832: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd4ea1af2c0> <<< 41175 1727204634.56835: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4ea1ae1b0> <<< 41175 1727204634.56851: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 41175 1727204634.56904: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 41175 1727204634.56907: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd4ea1afd40> <<< 41175 1727204634.56964: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4ea1af470> <<< 41175 1727204634.56967: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4ea192480> <<< 41175 1727204634.57009: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 41175 1727204634.57012: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 41175 1727204634.57047: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 41175 1727204634.57050: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 41175 1727204634.57094: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd4e9ea3cb0> <<< 41175 1727204634.57114: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 41175 1727204634.57155: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd4e9ecc7a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9ecc500> <<< 41175 1727204634.57194: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd4e9ecc7d0> <<< 41175 1727204634.57216: stdout chunk (state=3): >>># extension module '_sha2' loaded from 
'/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd4e9ecc9b0> <<< 41175 1727204634.57243: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9ea1e50> <<< 41175 1727204634.57255: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 41175 1727204634.57362: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 41175 1727204634.57396: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 41175 1727204634.57436: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9ece000> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9eccc80> <<< 41175 1727204634.57463: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4ea192b70> <<< 41175 1727204634.57480: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 41175 1727204634.57541: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 41175 1727204634.57558: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 41175 1727204634.57599: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 41175 1727204634.57632: stdout chunk 
(state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9efa3c0> <<< 41175 1727204634.57682: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 41175 1727204634.57730: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 41175 1727204634.57734: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 41175 1727204634.57746: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 41175 1727204634.57792: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9f12510> <<< 41175 1727204634.57816: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 41175 1727204634.57855: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 41175 1727204634.57932: stdout chunk (state=3): >>>import 'ntpath' # <<< 41175 1727204634.57950: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9f4b2f0> <<< 41175 1727204634.57979: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 41175 1727204634.58004: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 41175 1727204634.58034: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 41175 1727204634.58073: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 41175 1727204634.58167: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9f71a90> <<< 41175 1727204634.58246: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9f4b410> <<< 41175 1727204634.58291: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9f131a0> <<< 41175 1727204634.58340: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9d94410> <<< 41175 1727204634.58353: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9f11550> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9ecef60> <<< 41175 1727204634.58520: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 41175 1727204634.58549: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fd4e9d946e0> <<< 41175 1727204634.58720: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_ngmk5yfx/ansible_setup_payload.zip' <<< 41175 1727204634.58732: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.58877: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.58920: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc 
matches /usr/lib64/python3.12/pkgutil.py <<< 41175 1727204634.58924: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 41175 1727204634.58959: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 41175 1727204634.59040: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 41175 1727204634.59077: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9e021e0> <<< 41175 1727204634.59101: stdout chunk (state=3): >>>import '_typing' # <<< 41175 1727204634.59297: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9dd90d0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9dd8230> <<< 41175 1727204634.59306: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.59366: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available <<< 41175 1727204634.59370: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.59396: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils' # <<< 41175 1727204634.59409: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.60963: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.62269: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' <<< 41175 1727204634.62287: stdout chunk (state=3): >>>import '__future__' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9ddb5f0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 41175 1727204634.62325: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 41175 1727204634.62379: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 41175 1727204634.62384: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd4e9e31bb0> <<< 41175 1727204634.62422: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9e31970> <<< 41175 1727204634.62476: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9e31280> <<< 41175 1727204634.62481: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 41175 1727204634.62534: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9e319d0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9e02e70> import 'atexit' # <<< 41175 1727204634.62569: stdout chunk (state=3): 
>>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' <<< 41175 1727204634.62603: stdout chunk (state=3): >>># extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd4e9e328d0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd4e9e32b10> <<< 41175 1727204634.62634: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 41175 1727204634.62669: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 41175 1727204634.62685: stdout chunk (state=3): >>>import '_locale' # <<< 41175 1727204634.62752: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9e33020> <<< 41175 1727204634.62756: stdout chunk (state=3): >>>import 'pwd' # <<< 41175 1727204634.62786: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 41175 1727204634.62820: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9c98dd0> <<< 41175 1727204634.62865: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 41175 1727204634.62891: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 
'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd4e9c9a9f0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 41175 1727204634.62904: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 41175 1727204634.62947: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9c9b380> <<< 41175 1727204634.62950: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 41175 1727204634.63001: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 41175 1727204634.63005: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9c9c2c0> <<< 41175 1727204634.63027: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 41175 1727204634.63051: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 41175 1727204634.63081: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 41175 1727204634.63139: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9c9efc0> <<< 41175 1727204634.63182: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' <<< 41175 1727204634.63237: stdout chunk (state=3): >>># extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fd4e9c9f0e0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9c9d280> <<< 41175 1727204634.63241: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 41175 1727204634.63265: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 41175 1727204634.63309: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 41175 1727204634.63315: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 41175 1727204634.63366: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 41175 1727204634.63379: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9ca2f60> import '_tokenize' # <<< 41175 1727204634.63461: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9ca1a30> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9ca1790> <<< 41175 1727204634.63483: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 41175 1727204634.63558: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9ca3f20> <<< 41175 1727204634.63591: stdout chunk 
(state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9c9d790> <<< 41175 1727204634.63620: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd4e9ce7140> <<< 41175 1727204634.63665: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9ce72c0> <<< 41175 1727204634.63680: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 41175 1727204634.63726: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 41175 1727204634.63763: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 41175 1727204634.63782: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd4e9cece90> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9cecc50> # 
/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 41175 1727204634.63916: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 41175 1727204634.63955: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 41175 1727204634.63994: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd4e9cef3e0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9ced580> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 41175 1727204634.64052: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 41175 1727204634.64078: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 41175 1727204634.64099: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 41175 1727204634.64141: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9cf6b40> <<< 41175 1727204634.64292: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9cef4d0> <<< 41175 1727204634.64378: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 41175 1727204634.64405: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 
'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd4e9cf7920> <<< 41175 1727204634.64428: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd4e9cf7b90> <<< 41175 1727204634.64470: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd4e9cf7ec0> <<< 41175 1727204634.64504: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9ce75c0> <<< 41175 1727204634.64548: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 41175 1727204634.64553: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 41175 1727204634.64577: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 41175 1727204634.64593: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 41175 1727204634.64629: stdout chunk (state=3): >>># extension module '_socket' executed from 
'/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd4e9cfb5c0> <<< 41175 1727204634.64833: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 41175 1727204634.64836: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd4e9cfc680> <<< 41175 1727204634.64900: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9cf9d30> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd4e9cfb0b0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9cf9910> <<< 41175 1727204634.64931: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.64934: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # <<< 41175 1727204634.64949: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.65040: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.65175: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.65194: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # <<< 41175 1727204634.65226: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 41175 1727204634.65363: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.65505: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.66189: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.66875: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # <<< 41175 1727204634.66896: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 41175 1727204634.66941: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 41175 1727204634.66949: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 41175 1727204634.66986: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd4e9b84860> <<< 41175 1727204634.67106: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 41175 1727204634.67131: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9b85700> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9cf8230> <<< 41175 1727204634.67206: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 41175 1727204634.67209: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.67255: stdout chunk (state=3): >>># zipimport: zlib available import 
'ansible.module_utils._text' # <<< 41175 1727204634.67260: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.67426: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.67626: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 41175 1727204634.67646: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9b85dc0> # zipimport: zlib available <<< 41175 1727204634.68207: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.68759: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.68838: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.68943: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 41175 1727204634.68947: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.68989: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.69036: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 41175 1727204634.69040: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.69116: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.69246: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 41175 1727204634.69273: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 41175 1727204634.69291: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 41175 1727204634.69331: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.69381: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 41175 1727204634.69385: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 
1727204634.69663: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.69957: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 41175 1727204634.70030: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 41175 1727204634.70033: stdout chunk (state=3): >>>import '_ast' # <<< 41175 1727204634.70222: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9b865d0> # zipimport: zlib available # zipimport: zlib available <<< 41175 1727204634.70321: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 41175 1727204634.70351: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 41175 1727204634.70447: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 41175 1727204634.70570: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd4e9b8e060> <<< 41175 1727204634.70628: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd4e9b8e9f0> <<< 41175 1727204634.70664: stdout chunk (state=3): 
>>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9b87440> # zipimport: zlib available <<< 41175 1727204634.70704: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.70749: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 41175 1727204634.70769: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.70804: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.70849: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.70940: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.70997: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 41175 1727204634.71034: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 41175 1727204634.71128: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd4e9b8d8b0> <<< 41175 1727204634.71173: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9b8eb40> <<< 41175 1727204634.71229: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 41175 1727204634.71241: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.71298: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.71365: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.71399: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.71448: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 41175 1727204634.71471: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 41175 1727204634.71519: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 41175 1727204634.71527: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 41175 1727204634.71580: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 41175 1727204634.71610: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 41175 1727204634.71627: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 41175 1727204634.71678: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9c22d20> <<< 41175 1727204634.71731: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9b98a70> <<< 41175 1727204634.71828: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9b96ba0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9b969f0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 41175 1727204634.71843: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.71876: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.71902: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 41175 1727204634.71960: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 41175 1727204634.72008: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 41175 1727204634.72012: stdout chunk (state=3): >>>import 'ansible.modules' # <<< 41175 1727204634.72025: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.72076: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.72141: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.72171: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.72191: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.72241: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.72281: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.72326: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.72351: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 41175 1727204634.72379: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.72466: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.72685: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 41175 1727204634.72738: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 41175 1727204634.72741: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.72935: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.73130: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.73174: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.73240: 
stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 41175 1727204634.73282: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 41175 1727204634.73286: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 41175 1727204634.73330: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 41175 1727204634.73334: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 41175 1727204634.73372: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9c25a90> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 41175 1727204634.73404: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 41175 1727204634.73424: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 41175 1727204634.73446: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 41175 1727204634.73481: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 41175 1727204634.73512: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import 
'_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9148350> <<< 41175 1727204634.73538: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 41175 1727204634.73567: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd4e91486b0> <<< 41175 1727204634.73604: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9c053d0> <<< 41175 1727204634.73678: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9c04620> <<< 41175 1727204634.73683: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9c241a0> <<< 41175 1727204634.73686: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9c27c20> <<< 41175 1727204634.73710: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 41175 1727204634.73765: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 41175 1727204634.73807: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py <<< 41175 1727204634.73810: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 41175 1727204634.73847: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 
41175 1727204634.73896: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd4e914b650> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e914af00> <<< 41175 1727204634.73901: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' <<< 41175 1727204634.73950: stdout chunk (state=3): >>># extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd4e914b0e0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e914a330> <<< 41175 1727204634.73955: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 41175 1727204634.74091: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 41175 1727204634.74094: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e914b830> <<< 41175 1727204634.74114: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 41175 1727204634.74132: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 41175 1727204634.74157: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from 
'/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd4e91b2330> <<< 41175 1727204634.74208: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e91b0350> <<< 41175 1727204634.74224: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9c27e00> import 'ansible.module_utils.facts.timeout' # <<< 41175 1727204634.74253: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # <<< 41175 1727204634.74285: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # <<< 41175 1727204634.74307: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.74365: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.74432: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 41175 1727204634.74446: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.74503: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.74566: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 41175 1727204634.74570: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.74600: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available <<< 41175 1727204634.74634: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.74678: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # <<< 41175 1727204634.74682: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 41175 1727204634.74731: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.74794: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # <<< 41175 1727204634.74797: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.74843: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.74892: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 41175 1727204634.74906: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.74957: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.75024: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.75096: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.75162: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # <<< 41175 1727204634.75174: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.75720: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.76212: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 41175 1727204634.76215: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.76268: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.76328: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.76365: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.76421: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # <<< 41175 1727204634.76424: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.76447: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.76487: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 
41175 1727204634.76510: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.76553: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.76632: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 41175 1727204634.76635: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.76658: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.76706: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 41175 1727204634.76709: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.76743: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.76773: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 41175 1727204634.76784: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.76864: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.76964: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 41175 1727204634.77000: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e91b2660> <<< 41175 1727204634.77026: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py<<< 41175 1727204634.77056: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 41175 1727204634.77193: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e91b3290> import 'ansible.module_utils.facts.system.local' # <<< 41175 1727204634.77208: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.77269: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 41175 1727204634.77349: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 41175 1727204634.77353: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.77442: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.77555: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 41175 1727204634.77558: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.77635: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.77721: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 41175 1727204634.77734: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.77758: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.77814: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 41175 1727204634.77863: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 41175 1727204634.77934: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 41175 1727204634.78005: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd4e91e6720> <<< 41175 1727204634.78228: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e91cf170> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available <<< 41175 1727204634.78309: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.78365: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 41175 1727204634.78368: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 41175 1727204634.78452: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.78544: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.78675: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.78968: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available <<< 41175 1727204634.79005: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available <<< 41175 1727204634.79061: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.79131: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 41175 1727204634.79201: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd4e8f9e060> <<< 41175 1727204634.79203: stdout chunk (state=3): >>>import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e8f9dd60> import 'ansible.module_utils.facts.system.user' # <<< 41175 1727204634.79235: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 41175 1727204634.79238: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware' # <<< 41175 1727204634.79298: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.79314: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.79368: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.hardware.base' # <<< 41175 1727204634.79528: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.79649: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.79924: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 41175 1727204634.79930: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.80100: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.80267: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.80335: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.80406: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # <<< 41175 1727204634.80413: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.80438: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.80513: stdout chunk (state=3): >>># zipimport: zlib available<<< 41175 1727204634.80549: stdout chunk (state=3): >>> <<< 41175 1727204634.80784: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.81062: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # <<< 41175 1727204634.81084: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 41175 1727204634.81092: stdout chunk (state=3): >>> <<< 41175 1727204634.81147: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.81338: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.81572: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 41175 1727204634.81588: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.81649: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.81704: stdout chunk (state=3): >>># zipimport: zlib available <<< 
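The `stdout chunk (state=3)` records above carry the remote module's raw stdout between `>>>` and `<<<` delimiters; stitching the delimited payloads back together recovers the contiguous stream (here, the interpreter's verbose import trace, and later the facts JSON). A minimal sketch of that reassembly, assuming only the `>>>…<<<` framing visible in this trace (the sample strings below are shortened stand-ins, not lines copied from the log):

```python
import re

# Non-greedy match across newlines: each payload sits between >>> and <<<.
CHUNK_RE = re.compile(r">>>(.*?)<<<", re.DOTALL)

def reassemble(log: str) -> str:
    """Concatenate the >>>...<<< payloads of stdout-chunk records in order."""
    return "".join(CHUNK_RE.findall(log))

sample = (
    '41175 1727204634.90817: stdout chunk (state=3): >>>{"ansible_facts": <<< '
    '41175 1727204634.90847: stdout chunk (state=3): >>>{"ansible_fips": false}}<<<'
)
print(reassemble(sample))  # -> {"ansible_facts": {"ansible_fips": false}}
```

This is why JSON values in the trace can appear split mid-token across two chunk records: the split is a read-buffer boundary, not part of the payload.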
41175 1727204634.82555: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.83290: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # <<< 41175 1727204634.83298: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hurd' # <<< 41175 1727204634.83427: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.83496: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.83678: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 41175 1727204634.83759: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.83861: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.84030: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 41175 1727204634.84047: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.84331: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.84658: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available <<< 41175 1727204634.84703: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.84771: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 41175 1727204634.84778: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.84958: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.85151: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.85532: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.85779: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 41175 1727204634.85801: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 41175 1727204634.85829: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.85869: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 41175 1727204634.85911: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.85914: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.85938: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available <<< 41175 1727204634.86025: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.86103: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 41175 1727204634.86131: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.86154: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.86178: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available <<< 41175 1727204634.86242: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.86308: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 41175 1727204634.86329: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.86369: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.86439: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available <<< 41175 1727204634.86745: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.87045: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 41175 1727204634.87060: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.87118: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.87183: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 41175 1727204634.87197: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.87233: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.87268: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 41175 1727204634.87283: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.87313: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.87354: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 41175 1727204634.87365: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.87398: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.87427: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # <<< 41175 1727204634.87450: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.87519: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.87624: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 41175 1727204634.87627: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.87654: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available <<< 41175 1727204634.87706: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.87766: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 41175 1727204634.87771: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.87811: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.87814: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.87859: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.87915: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.87987: stdout chunk (state=3): >>># zipimport: zlib available <<< 
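The `import 'name' # <loader>` lines interleaved above are the remote Python interpreter's verbose import trace (the gathered facts later show `PYTHONVERBOSE=1` in `ansible_env`); each one records a module load during fact collection. A small sketch for pulling those module names out of a captured trace, assuming only the line shapes shown here (the sample is an abbreviated stand-in, not a verbatim excerpt):

```python
import re

# Matches verbose-import lines such as:
#   import 'ansible.module_utils.facts.network.linux' # <SourceFileLoader ...>
IMPORT_RE = re.compile(r"import '([\w.]+)' #")

def imported_modules(trace: str) -> list[str]:
    """Return module names in the order the verbose trace imported them."""
    return IMPORT_RE.findall(trace)

sample = (
    "import 'ansible.module_utils.facts.network.linux' # \n"
    "# zipimport: zlib available\n"
    "import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e91b2660>\n"
)
print(imported_modules(sample))
# -> ['ansible.module_utils.facts.network.linux', 'glob']
```

The `# zipimport: zlib available` lines, by contrast, are emitted per zip-packaged source the AnsiballZ payload can decompress and are not module imports, so a filter like this skips them.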
41175 1727204634.88084: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # <<< 41175 1727204634.88088: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.dragonfly' # <<< 41175 1727204634.88113: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.88143: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.88217: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 41175 1727204634.88222: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.88428: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.88650: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 41175 1727204634.88662: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.88712: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.88762: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 41175 1727204634.88773: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.88816: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.88874: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 41175 1727204634.88889: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.88972: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.89068: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # <<< 41175 1727204634.89086: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available <<< 41175 1727204634.89183: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.89278: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 
'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 41175 1727204634.89357: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204634.89979: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 41175 1727204634.90017: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 41175 1727204634.90034: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 41175 1727204634.90091: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd4e8fc7b30> <<< 41175 1727204634.90118: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e8fc47a0> <<< 41175 1727204634.90149: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e8fc5490> <<< 41175 1727204634.90817: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilities_enforced": 
"False", "ansible_system_capabilities": [], "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAI5YZQ7OH6eqgmanrwxkUl16pMvE2q26X32NofYRKBzF04m84VIsiCBP80rN+sGEKnRhTwlxJwcSfAyscmxkynk8ozeR0SaMEECkbOjee1DqGR1yz8VSKEIk2gZ+ImYscF6c32jGvz1w/gz9baswEs+v92Ljqv3+V3s8foVkwWM1AAAAFQDApo03iAyJzp9y7AillVl9LpN8rwAAAIBNHNvfLLH/rvWMdavYWGiljarx5Z8cDKFv4QiliuY2AenrQ5mjBN3ZJZuDpmwC9vuoPM+TWxp9pbrnVJy4VM6iS8c/Lr9I982fUD4neMvJEywdnYtsRhezGMCk57/Npw91h6EKhcAYiaFF53jl540WIjTvu2bEA8Hgb11YGH+isAAAAIAkremps+61DEFeDWQjRHbf8fZzhmpUYduU+sHRW5usa/1cOOeeN/8XBHfMST6TPedAY/6t7Oxda9D2mq6mo2Rl9arSQWcBypqwvzRiz0LGnRnElGtXKJALy6vYKG7xi+29ZmqlBvD14cB7/wSZqZP9MkRj3+QzQJLvNnuGRyLguA==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDBj8PEtqglWtlJ3r3hgP2TELjSd8JOOpjIitLlWjKdUao5ePB6PWTf9MZV0rLZr0re7hAS1EWeexARYQakyETmyOoPmRCaD5vvrfN3AJJ6I+O2EhApLpYrEORJbTfrme6AoCGmxQG8tR7j3YpVOvePZ65ka7FDUWnNLI0DWpyDURAKmvOxtiOcYazpmB7GJ/5ycpEAV7KGp7tEQ9MNIAbSaYTBXVBNa5V2HyEmcabs+/Qy/jp8OWy+Tl3uCUV0SmFplVCKib9Kp3eEMZd5udXsYnmUYtLNMJQkQOzTdol5AozustkdBnasVn/RSnQpWQMBrrUQMxchNOb8FDAuH6AONEVJl9mHY6mk3zfkkyPZE6sIrMIj0B48xTWzMIjC+N9SN7DRRUWzjYIqkL5fsYu0fkkGuZeNvyJRlv8h7oFWA7YtvNHdNYf41mkXryERg8V3zI0aZcmQul6XTOxywwd4b5sudMIng09hfyPOKtnYi6DIN2h5FxOWlvBEbLlcd2U=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPUvqdp1GSRMDwSqfOZO1hLGpDfzy51B9cIhTK2AWy7qlUXPaSlJ0jc31uj+CW3SnUW36VSKRHdj9R9hJe<<< 41175 1727204634.90847: stdout chunk (state=3): >>>v9Zic=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFL7RdA+aCgUcBhcJBLwti3mnwduhYXxSw8RlI3Cvebm", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "03", 
"second": "54", "epoch": "1727204634", "epoch_int": "1727204634", "date": "2024-09-24", "time": "15:03:54", "iso8601_micro": "2024-09-24T19:03:54.898437Z", "iso8601": "2024-09-24T19:03:54Z", "iso8601_basic": "20240924T150354898437", "iso8601_basic_short": "20240924T150354", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_apparmor": {"status": "disabled"}, "ansible_lsb": {}, "ansible_local": {}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 50414 10.31.10.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 50414 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec272ed147e29e35f2e68cd6465c5ec1", "ansible_fips": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": 
"ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_pkg_mgr": "dnf", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 41175 1727204634.91437: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 <<< 41175 1727204634.91450: stdout chunk (state=3): >>># clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing 
_frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal <<< 41175 1727204634.91474: stdout chunk (state=3): >>># cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword <<< 41175 1727204634.91517: stdout chunk (state=3): >>># cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # 
cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect <<< 41175 1727204634.91558: stdout chunk (state=3): >>># cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select <<< 41175 1727204634.91604: stdout chunk (state=3): >>># cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize 
# cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux <<< 41175 1727204634.91608: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # 
cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale <<< 41175 1727204634.91624: stdout chunk (state=3): >>># destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # 
cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ <<< 41175 1727204634.91650: stdout chunk (state=3): >>># cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg <<< 41175 1727204634.91683: stdout chunk (state=3): >>># cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing 
ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # 
cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd <<< 41175 1727204634.91716: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy 
ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # 
destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 41175 1727204634.92040: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 41175 1727204634.92092: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 <<< 41175 1727204634.92112: stdout chunk (state=3): >>># destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 41175 1727204634.92133: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 41175 1727204634.92164: stdout chunk (state=3): >>># destroy ntpath <<< 41175 1727204634.92194: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner <<< 41175 1727204634.92238: stdout chunk (state=3): >>># destroy _json # destroy grp # destroy encodings # 
destroy _locale <<< 41175 1727204634.92249: stdout chunk (state=3): >>># destroy locale # destroy select # destroy _signal # destroy _posixsubprocess <<< 41175 1727204634.92275: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 41175 1727204634.92312: stdout chunk (state=3): >>># destroy _hashlib <<< 41175 1727204634.92337: stdout chunk (state=3): >>># destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 41175 1727204634.92374: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector<<< 41175 1727204634.92409: stdout chunk (state=3): >>> # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle <<< 41175 1727204634.92437: stdout chunk (state=3): >>># destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata <<< 41175 1727204634.92469: stdout chunk (state=3): >>># destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl <<< 41175 1727204634.92486: stdout chunk (state=3): >>># destroy datetime # destroy subprocess # destroy base64 <<< 41175 1727204634.92520: stdout chunk (state=3): >>># destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios <<< 41175 1727204634.92556: stdout chunk (state=3): >>># destroy errno # destroy json # destroy socket # destroy struct <<< 41175 1727204634.92578: stdout chunk (state=3): >>># destroy glob <<< 41175 1727204634.92613: stdout chunk (state=3): >>># destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector 
<<< 41175 1727204634.92617: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux <<< 41175 1727204634.92673: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform <<< 41175 1727204634.92710: stdout chunk (state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math <<< 41175 1727204634.92742: stdout chunk (state=3): >>># cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator <<< 41175 1727204634.92775: stdout chunk (state=3): >>># cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping 
encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external <<< 41175 1727204634.92825: stdout chunk (state=3): >>># cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 41175 1727204634.92828: stdout chunk (state=3): >>># destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 41175 1727204634.92970: stdout chunk (state=3): >>># destroy sys.monitoring <<< 41175 1727204634.92973: stdout chunk (state=3): >>># destroy _socket <<< 41175 1727204634.93017: stdout chunk (state=3): >>># destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 41175 1727204634.93048: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 41175 1727204634.93075: stdout chunk (state=3): >>># destroy _typing <<< 41175 1727204634.93106: stdout chunk (state=3): >>># destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator <<< 41175 1727204634.93144: stdout chunk (state=3): >>># destroy 
ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 41175 1727204634.93156: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 41175 1727204634.93242: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading <<< 41175 1727204634.93274: stdout chunk (state=3): >>># destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 41175 1727204634.93288: stdout chunk (state=3): >>># destroy _random # destroy _weakref <<< 41175 1727204634.93331: stdout chunk (state=3): >>># destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools <<< 41175 1727204634.93361: stdout chunk (state=3): >>># destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread <<< 41175 1727204634.93380: stdout chunk (state=3): >>># clear sys.audit hooks <<< 41175 1727204634.93850: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
<<< 41175 1727204634.93854: stdout chunk (state=3): >>><<< 41175 1727204634.93856: stderr chunk (state=3): >>><<< 41175 1727204634.94214: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4ea2d44d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4ea2a3ad0> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4ea2d6a20> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # 
/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4ea0c50a0> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4ea0c5fd0> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4ea103e90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4ea103f50> # 
/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4ea13b860> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4ea13bef0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4ea11bb60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4ea119280> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4ea101040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4ea15f740> import 're._parser' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fd4ea15e360> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4ea11a270> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4ea102f30> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4ea190740> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4ea1002c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd4ea190bf0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4ea190aa0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd4ea190e30> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4ea0fede0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from 
'/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4ea191520> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4ea1911f0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4ea192420> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4ea1ac650> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd4ea1add60> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fd4ea1aec60> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd4ea1af2c0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4ea1ae1b0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd4ea1afd40> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4ea1af470> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4ea192480> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd4e9ea3cb0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension 
module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd4e9ecc7a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9ecc500> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd4e9ecc7d0> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd4e9ecc9b0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9ea1e50> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9ece000> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9eccc80> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4ea192b70> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9efa3c0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9f12510> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9f4b2f0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9f71a90> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9f4b410> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9f131a0> # 
/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9d94410> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9f11550> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9ecef60> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fd4e9d946e0> # zipimport: found 103 names in '/tmp/ansible_setup_payload_ngmk5yfx/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9e021e0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9dd90d0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9dd8230> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # 
code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9ddb5f0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd4e9e31bb0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9e31970> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9e31280> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9e319d0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9e02e70> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd4e9e328d0> # extension module 
'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd4e9e32b10> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9e33020> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9c98dd0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd4e9c9a9f0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9c9b380> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9c9c2c0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches 
/usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9c9efc0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd4e9c9f0e0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9c9d280> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9ca2f60> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9ca1a30> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9ca1790> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9ca3f20> import 'traceback' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9c9d790> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd4e9ce7140> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9ce72c0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd4e9cece90> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9cecc50> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import 
'_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd4e9cef3e0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9ced580> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9cf6b40> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9cef4d0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd4e9cf7920> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd4e9cf7b90> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd4e9cf7ec0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9ce75c0> # 
/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd4e9cfb5c0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd4e9cfc680> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9cf9d30> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd4e9cfb0b0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9cf9910> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available 
# zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd4e9b84860> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9b85700> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9cf8230> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9b85dc0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9b865d0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd4e9b8e060> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd4e9b8e9f0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9b87440> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd4e9b8d8b0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9b8eb40> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9c22d20> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9b98a70> import 'distro.distro' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9b96ba0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9b969f0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9c25a90> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from 
'/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9148350> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd4e91486b0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9c053d0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9c04620> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9c241a0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9c27c20> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' 
executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd4e914b650> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e914af00> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd4e914b0e0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e914a330> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e914b830> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd4e91b2330> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e91b0350> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e9c27e00> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e91b2660> # 
/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e91b3290> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd4e91e6720> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e91cf170> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from 
'/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd4e8f9e060> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e8f9dd60> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd4e8fc7b30> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e8fc47a0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd4e8fc5490> {"ansible_facts": 
{"ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAI5YZQ7OH6eqgmanrwxkUl16pMvE2q26X32NofYRKBzF04m84VIsiCBP80rN+sGEKnRhTwlxJwcSfAyscmxkynk8ozeR0SaMEECkbOjee1DqGR1yz8VSKEIk2gZ+ImYscF6c32jGvz1w/gz9baswEs+v92Ljqv3+V3s8foVkwWM1AAAAFQDApo03iAyJzp9y7AillVl9LpN8rwAAAIBNHNvfLLH/rvWMdavYWGiljarx5Z8cDKFv4QiliuY2AenrQ5mjBN3ZJZuDpmwC9vuoPM+TWxp9pbrnVJy4VM6iS8c/Lr9I982fUD4neMvJEywdnYtsRhezGMCk57/Npw91h6EKhcAYiaFF53jl540WIjTvu2bEA8Hgb11YGH+isAAAAIAkremps+61DEFeDWQjRHbf8fZzhmpUYduU+sHRW5usa/1cOOeeN/8XBHfMST6TPedAY/6t7Oxda9D2mq6mo2Rl9arSQWcBypqwvzRiz0LGnRnElGtXKJALy6vYKG7xi+29ZmqlBvD14cB7/wSZqZP9MkRj3+QzQJLvNnuGRyLguA==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDBj8PEtqglWtlJ3r3hgP2TELjSd8JOOpjIitLlWjKdUao5ePB6PWTf9MZV0rLZr0re7hAS1EWeexARYQakyETmyOoPmRCaD5vvrfN3AJJ6I+O2EhApLpYrEORJbTfrme6AoCGmxQG8tR7j3YpVOvePZ65ka7FDUWnNLI0DWpyDURAKmvOxtiOcYazpmB7GJ/5ycpEAV7KGp7tEQ9MNIAbSaYTBXVBNa5V2HyEmcabs+/Qy/jp8OWy+Tl3uCUV0SmFplVCKib9Kp3eEMZd5udXsYnmUYtLNMJQkQOzTdol5AozustkdBnasVn/RSnQpWQMBrrUQMxchNOb8FDAuH6AONEVJl9mHY6mk3zfkkyPZE6sIrMIj0B48xTWzMIjC+N9SN7DRRUWzjYIqkL5fsYu0fkkGuZeNvyJRlv8h7oFWA7YtvNHdNYf41mkXryERg8V3zI0aZcmQul6XTOxywwd4b5sudMIng09hfyPOKtnYi6DIN2h5FxOWlvBEbLlcd2U=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPUvqdp1GSRMDwSqfOZO1hLGpDfzy51B9cIhTK2AWy7qlUXPaSlJ0jc31uj+CW3SnUW36VSKRHdj9R9hJev9Zic=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFL7RdA+aCgUcBhcJBLwti3mnwduhYXxSw8RlI3Cvebm", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "03", "second": "54", "epoch": "1727204634", "epoch_int": "1727204634", "date": "2024-09-24", "time": "15:03:54", "iso8601_micro": "2024-09-24T19:03:54.898437Z", "iso8601": "2024-09-24T19:03:54Z", "iso8601_basic": "20240924T150354898437", "iso8601_basic_short": "20240924T150354", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_apparmor": {"status": "disabled"}, "ansible_lsb": {}, "ansible_local": {}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 50414 10.31.10.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 50414 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": 
"3.12.6", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec272ed147e29e35f2e68cd6465c5ec1", "ansible_fips": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_pkg_mgr": "dnf", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear 
sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] 
removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing 
selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing 
ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # 
cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing 
ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # 
cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy 
ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy 
ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy 
multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping 
importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy 
ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 
2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. [WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv ... # destroy builtins # destroy _thread # clear sys.audit hooks 41175 1727204634.95829: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204634.3794992-41295-21111145522117/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info':
None}) 41175 1727204634.95832: _low_level_execute_command(): starting 41175 1727204634.95835: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204634.3794992-41295-21111145522117/ > /dev/null 2>&1 && sleep 0' 41175 1727204634.96009: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 1727204634.96026: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204634.96056: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204634.96080: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204634.96100: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 41175 1727204634.96160: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204634.96229: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204634.96256: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204634.96297: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204634.96376: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204634.98347: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 
41175 1727204634.98371: stderr chunk (state=3): >>><<< 41175 1727204634.98380: stdout chunk (state=3): >>><<< 41175 1727204634.98406: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204634.98422: handler run complete 41175 1727204634.98574: variable 'ansible_facts' from source: unknown 41175 1727204634.98587: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204634.98782: variable 'ansible_facts' from source: unknown 41175 1727204634.98864: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204634.98965: attempt loop complete, returning result 41175 1727204634.98973: _execute() done 41175 1727204634.98981: dumping result to json 41175 1727204634.99006: done dumping result, returning 41175 
1727204634.99110: done running TaskExecutor() for managed-node3/TASK: Gather the minimum subset of ansible_facts required by the network role test [12b410aa-8751-f070-39c4-000000000106] 41175 1727204634.99115: sending task result for task 12b410aa-8751-f070-39c4-000000000106 41175 1727204634.99398: done sending task result for task 12b410aa-8751-f070-39c4-000000000106 41175 1727204634.99403: WORKER PROCESS EXITING ok: [managed-node3] 41175 1727204634.99637: no more pending results, returning what we have 41175 1727204634.99641: results queue empty 41175 1727204634.99642: checking for any_errors_fatal 41175 1727204634.99644: done checking for any_errors_fatal 41175 1727204634.99645: checking for max_fail_percentage 41175 1727204634.99647: done checking for max_fail_percentage 41175 1727204634.99647: checking to see if all hosts have failed and the running result is not ok 41175 1727204634.99649: done checking to see if all hosts have failed 41175 1727204634.99650: getting the remaining hosts for this loop 41175 1727204634.99651: done getting the remaining hosts for this loop 41175 1727204634.99660: getting the next task for host managed-node3 41175 1727204634.99677: done getting next task for host managed-node3 41175 1727204634.99680: ^ task is: TASK: Check if system is ostree 41175 1727204634.99683: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204634.99686: getting variables 41175 1727204634.99688: in VariableManager get_vars() 41175 1727204634.99745: Calling all_inventory to load vars for managed-node3 41175 1727204634.99749: Calling groups_inventory to load vars for managed-node3 41175 1727204634.99754: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204634.99771: Calling all_plugins_play to load vars for managed-node3 41175 1727204634.99782: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204634.99788: Calling groups_plugins_play to load vars for managed-node3 41175 1727204635.00057: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204635.00247: done with get_vars() 41175 1727204635.00256: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Tuesday 24 September 2024 15:03:55 -0400 (0:00:00.719) 0:00:02.142 ***** 41175 1727204635.00337: entering _queue_task() for managed-node3/stat 41175 1727204635.00555: worker is 1 (out of 1 available) 41175 1727204635.00569: exiting _queue_task() for managed-node3/stat 41175 1727204635.00581: done queuing things up, now waiting for results queue to drain 41175 1727204635.00582: waiting for pending results... 
41175 1727204635.00740: running TaskExecutor() for managed-node3/TASK: Check if system is ostree 41175 1727204635.00824: in run() - task 12b410aa-8751-f070-39c4-000000000108 41175 1727204635.00834: variable 'ansible_search_path' from source: unknown 41175 1727204635.00837: variable 'ansible_search_path' from source: unknown 41175 1727204635.00868: calling self._execute() 41175 1727204635.00937: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204635.00940: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204635.00952: variable 'omit' from source: magic vars 41175 1727204635.01348: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41175 1727204635.01574: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41175 1727204635.01619: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41175 1727204635.01646: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41175 1727204635.01675: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41175 1727204635.01767: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41175 1727204635.01826: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41175 1727204635.01849: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204635.01871: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41175 1727204635.02059: Evaluated conditional (not __network_is_ostree is defined): True 41175 1727204635.02063: variable 'omit' from source: magic vars 41175 1727204635.02068: variable 'omit' from source: magic vars 41175 1727204635.02210: variable 'omit' from source: magic vars 41175 1727204635.02214: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41175 1727204635.02217: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41175 1727204635.02219: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41175 1727204635.02222: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204635.02224: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204635.02243: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41175 1727204635.02246: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204635.02271: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204635.02420: Set connection var ansible_shell_executable to /bin/sh 41175 1727204635.02441: Set connection var ansible_shell_type to sh 41175 1727204635.02453: Set connection var ansible_pipelining to False 41175 1727204635.02469: Set connection var ansible_timeout to 10 41175 1727204635.02479: Set connection var ansible_connection to ssh 41175 1727204635.02491: Set connection var ansible_module_compression to ZIP_DEFLATED 41175 1727204635.02522: variable 'ansible_shell_executable' from source: unknown 41175 1727204635.02534: variable 'ansible_connection' from 
source: unknown 41175 1727204635.02554: variable 'ansible_module_compression' from source: unknown 41175 1727204635.02561: variable 'ansible_shell_type' from source: unknown 41175 1727204635.02568: variable 'ansible_shell_executable' from source: unknown 41175 1727204635.02575: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204635.02584: variable 'ansible_pipelining' from source: unknown 41175 1727204635.02593: variable 'ansible_timeout' from source: unknown 41175 1727204635.02603: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204635.02806: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 41175 1727204635.02822: variable 'omit' from source: magic vars 41175 1727204635.02833: starting attempt loop 41175 1727204635.02840: running the handler 41175 1727204635.02877: _low_level_execute_command(): starting 41175 1727204635.02897: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41175 1727204635.03461: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204635.03465: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204635.03470: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204635.03521: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204635.03525: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204635.03567: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 41175 1727204635.05974: stdout chunk (state=3): >>>/root <<< 41175 1727204635.06111: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204635.06209: stderr chunk (state=3): >>><<< 41175 1727204635.06234: stdout chunk (state=3): >>><<< 41175 1727204635.06259: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 41175 1727204635.06403: _low_level_execute_command(): starting 41175 1727204635.06408: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204635.0628974-41309-159945870145183 `" && echo ansible-tmp-1727204635.0628974-41309-159945870145183="` echo /root/.ansible/tmp/ansible-tmp-1727204635.0628974-41309-159945870145183 `" ) && sleep 0' 41175 1727204635.07030: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 1727204635.07044: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204635.07096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204635.07111: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41175 1727204635.07212: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204635.07249: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204635.07322: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 4 <<< 41175 1727204635.09808: stdout chunk (state=3): >>>ansible-tmp-1727204635.0628974-41309-159945870145183=/root/.ansible/tmp/ansible-tmp-1727204635.0628974-41309-159945870145183 <<< 41175 1727204635.10006: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204635.10034: stderr chunk (state=3): >>><<< 41175 1727204635.10037: stdout chunk (state=3): >>><<< 41175 1727204635.10056: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204635.0628974-41309-159945870145183=/root/.ansible/tmp/ansible-tmp-1727204635.0628974-41309-159945870145183 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 41175 1727204635.10106: variable 'ansible_module_compression' from source: unknown 41175 1727204635.10155: ANSIBALLZ: Using lock for stat 41175 1727204635.10159: ANSIBALLZ: Acquiring lock 41175 1727204635.10161: 
ANSIBALLZ: Lock acquired: 140088839298064 41175 1727204635.10164: ANSIBALLZ: Creating module 41175 1727204635.21444: ANSIBALLZ: Writing module into payload 41175 1727204635.21532: ANSIBALLZ: Writing module 41175 1727204635.21546: ANSIBALLZ: Renaming module 41175 1727204635.21554: ANSIBALLZ: Done creating module 41175 1727204635.21569: variable 'ansible_facts' from source: unknown 41175 1727204635.21628: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204635.0628974-41309-159945870145183/AnsiballZ_stat.py 41175 1727204635.21739: Sending initial data 41175 1727204635.21742: Sent initial data (153 bytes) 41175 1727204635.22229: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204635.22233: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204635.22235: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204635.22238: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204635.22288: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204635.22296: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 
1727204635.22360: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 41175 1727204635.24809: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41175 1727204635.24842: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41175 1727204635.24884: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41175qa2aqh90/tmpqufab0rm /root/.ansible/tmp/ansible-tmp-1727204635.0628974-41309-159945870145183/AnsiballZ_stat.py <<< 41175 1727204635.24888: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204635.0628974-41309-159945870145183/AnsiballZ_stat.py" <<< 41175 1727204635.24928: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-41175qa2aqh90/tmpqufab0rm" to remote "/root/.ansible/tmp/ansible-tmp-1727204635.0628974-41309-159945870145183/AnsiballZ_stat.py" <<< 41175 1727204635.24930: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204635.0628974-41309-159945870145183/AnsiballZ_stat.py" <<< 41175 1727204635.25745: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204635.25822: stderr chunk (state=3): >>><<< 41175 1727204635.25826: stdout chunk (state=3): >>><<< 41175 1727204635.25843: done transferring module to remote 41175 1727204635.25861: _low_level_execute_command(): starting 41175 1727204635.25864: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204635.0628974-41309-159945870145183/ /root/.ansible/tmp/ansible-tmp-1727204635.0628974-41309-159945870145183/AnsiballZ_stat.py && sleep 0' 41175 1727204635.26354: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204635.26358: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204635.26360: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204635.26363: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204635.26365: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204635.26425: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204635.26430: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204635.26466: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 41175 1727204635.29103: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204635.29154: stderr chunk (state=3): >>><<< 41175 1727204635.29158: stdout chunk (state=3): >>><<< 41175 1727204635.29176: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 41175 1727204635.29180: _low_level_execute_command(): starting 41175 1727204635.29186: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204635.0628974-41309-159945870145183/AnsiballZ_stat.py && sleep 0' 41175 1727204635.29653: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204635.29656: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204635.29659: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204635.29661: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 41175 1727204635.29664: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204635.29714: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 
41175 1727204635.29723: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204635.29767: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 41175 1727204635.33169: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # <<< 41175 1727204635.33230: stdout chunk (state=3): >>>import '_io' # <<< 41175 1727204635.33246: stdout chunk (state=3): >>>import 'marshal' # <<< 41175 1727204635.33304: stdout chunk (state=3): >>>import 'posix' # <<< 41175 1727204635.33367: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 41175 1727204635.33382: stdout chunk (state=3): >>> <<< 41175 1727204635.33395: stdout chunk (state=3): >>># installing zipimport hook <<< 41175 1727204635.33561: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # <<< 41175 1727204635.33567: stdout chunk (state=3): >>>import 'codecs' # <<< 41175 1727204635.33624: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 41175 1727204635.33663: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 41175 1727204635.33676: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc40c4d0> <<< 41175 1727204635.33702: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc3dbad0> <<< 41175 1727204635.33740: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py <<< 41175 1727204635.33744: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 41175 1727204635.33769: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc40ea20> <<< 41175 1727204635.33805: stdout chunk (state=3): >>>import '_signal' # <<< 41175 1727204635.33837: stdout chunk (state=3): >>>import '_abc' # <<< 41175 1727204635.33857: stdout chunk (state=3): >>>import 'abc' # <<< 41175 1727204635.33895: stdout chunk (state=3): >>>import 'io' # <<< 41175 1727204635.33901: stdout chunk (state=3): >>> <<< 41175 1727204635.33951: stdout chunk (state=3): >>>import '_stat' # <<< 41175 1727204635.33959: stdout chunk (state=3): >>>import 'stat' # <<< 41175 1727204635.34110: stdout chunk (state=3): >>>import '_collections_abc' # <<< 41175 1727204635.34262: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' <<< 41175 1727204635.34272: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 41175 1727204635.34312: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py <<< 41175 1727204635.34326: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 41175 1727204635.34359: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc1bd0a0> <<< 41175 1727204635.34447: 
stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 41175 1727204635.34469: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 41175 1727204635.34493: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc1bdfd0> <<< 41175 1727204635.34536: stdout chunk (state=3): >>>import 'site' # <<< 41175 1727204635.34584: stdout chunk (state=3): >>>Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 41175 1727204635.34980: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 41175 1727204635.35153: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 41175 1727204635.35176: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 41175 1727204635.35202: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc1fbe90> <<< 41175 1727204635.35235: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 41175 1727204635.35262: 
stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 41175 1727204635.35301: stdout chunk (state=3): >>>import '_operator' # <<< 41175 1727204635.35316: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc1fbf50> <<< 41175 1727204635.35351: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 41175 1727204635.35391: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 41175 1727204635.35430: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 41175 1727204635.35506: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 41175 1727204635.35537: stdout chunk (state=3): >>>import 'itertools' # <<< 41175 1727204635.35568: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py <<< 41175 1727204635.35754: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc233860> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc233ef0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc213b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc211280> <<< 41175 1727204635.35909: stdout chunk (state=3): >>>import 'enum' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc1f9040> <<< 41175 1727204635.35951: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 41175 1727204635.35984: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 41175 1727204635.36007: stdout chunk (state=3): >>>import '_sre' # <<< 41175 1727204635.36261: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc257740> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc256360> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc212270> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc1faf30> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py<<< 41175 1727204635.36269: stdout chunk (state=3): >>> <<< 41175 1727204635.36290: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' <<< 41175 1727204635.36294: stdout chunk (state=3): >>>import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc288740> <<< 41175 1727204635.36310: stdout chunk (state=3): >>>import 're' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc1f82c0> <<< 41175 1727204635.36347: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py <<< 41175 1727204635.36357: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 41175 1727204635.36399: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 41175 1727204635.36425: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4dc288bf0> <<< 41175 1727204635.36433: stdout chunk (state=3): >>>import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc288aa0> <<< 41175 1727204635.36476: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 41175 1727204635.36502: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4dc288e30> <<< 41175 1727204635.36525: stdout chunk (state=3): >>>import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc1f6de0> <<< 41175 1727204635.36577: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py<<< 41175 1727204635.36581: stdout chunk (state=3): >>> <<< 41175 1727204635.36600: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 41175 1727204635.36630: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py<<< 41175 1727204635.36634: stdout chunk (state=3): >>> <<< 41175 1727204635.36679: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc'<<< 41175 1727204635.36685: stdout chunk (state=3): >>> <<< 41175 1727204635.36711: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc289520><<< 41175 1727204635.36724: stdout chunk (state=3): >>> <<< 41175 1727204635.36733: stdout chunk (state=3): >>>import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc2891f0><<< 41175 1727204635.36752: stdout chunk (state=3): >>> <<< 41175 1727204635.36763: stdout chunk (state=3): >>>import 'importlib.machinery' # <<< 41175 1727204635.36806: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py<<< 41175 1727204635.36819: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc'<<< 41175 1727204635.36851: stdout chunk (state=3): >>> import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc28a420><<< 41175 1727204635.36856: stdout chunk (state=3): >>> <<< 41175 1727204635.36879: stdout chunk (state=3): >>>import 'importlib.util' # <<< 41175 1727204635.36904: stdout chunk (state=3): >>> import 'runpy' # <<< 41175 1727204635.36946: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 41175 1727204635.37027: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py <<< 41175 1727204635.37050: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc2a4650><<< 41175 1727204635.37081: stdout chunk (state=3): >>> import 'errno' # <<< 41175 1727204635.37121: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so'<<< 41175 1727204635.37150: stdout chunk (state=3): >>> # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so'<<< 41175 1727204635.37153: stdout chunk (state=3): >>> import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4dc2a5d60><<< 41175 1727204635.37191: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py<<< 41175 1727204635.37195: stdout chunk (state=3): >>> <<< 41175 1727204635.37219: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc'<<< 41175 1727204635.37227: stdout chunk (state=3): >>> <<< 41175 1727204635.37261: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc'<<< 41175 1727204635.37285: stdout chunk (state=3): >>> import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc2a6c60><<< 41175 1727204635.37292: stdout chunk (state=3): >>> <<< 41175 1727204635.37340: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so'<<< 41175 1727204635.37366: stdout chunk (state=3): >>> import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4dc2a72c0> import 'bz2' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc2a61b0><<< 41175 1727204635.37372: stdout chunk (state=3): >>> <<< 41175 1727204635.37398: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py<<< 41175 1727204635.37425: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc'<<< 41175 1727204635.37474: stdout chunk (state=3): >>> # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so'<<< 41175 1727204635.37487: stdout chunk (state=3): >>> <<< 41175 1727204635.37509: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so'<<< 41175 1727204635.37512: stdout chunk (state=3): >>> import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4dc2a7d40> <<< 41175 1727204635.37545: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc2a7470><<< 41175 1727204635.37552: stdout chunk (state=3): >>> <<< 41175 1727204635.37641: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc28a480> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py<<< 41175 1727204635.37648: stdout chunk (state=3): >>> <<< 41175 1727204635.37687: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc'<<< 41175 1727204635.37721: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 41175 1727204635.37766: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc'<<< 41175 1727204635.37812: stdout chunk (state=3): >>> # extension module 'math' loaded from 
'/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so'<<< 41175 1727204635.37831: stdout chunk (state=3): >>> <<< 41175 1727204635.37844: stdout chunk (state=3): >>># extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4dc03bcb0><<< 41175 1727204635.37855: stdout chunk (state=3): >>> <<< 41175 1727204635.37886: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc'<<< 41175 1727204635.37926: stdout chunk (state=3): >>> # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' <<< 41175 1727204635.37943: stdout chunk (state=3): >>># extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' <<< 41175 1727204635.37998: stdout chunk (state=3): >>>import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4dc0647a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc064500> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' <<< 41175 1727204635.38010: stdout chunk (state=3): >>># extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' <<< 41175 1727204635.38053: stdout chunk (state=3): >>>import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4dc0647d0> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' <<< 41175 1727204635.38074: stdout chunk (state=3): >>># extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' <<< 41175 
1727204635.38113: stdout chunk (state=3): >>>import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4dc0649b0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc039e50> <<< 41175 1727204635.38150: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py<<< 41175 1727204635.38248: stdout chunk (state=3): >>> <<< 41175 1727204635.38328: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 41175 1727204635.38367: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 41175 1727204635.38396: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 41175 1727204635.38423: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc066000> <<< 41175 1727204635.38466: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc064c80> <<< 41175 1727204635.38506: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc28ab70> <<< 41175 1727204635.38552: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 41175 1727204635.38637: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc'<<< 41175 1727204635.38672: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py<<< 41175 1727204635.38678: stdout chunk (state=3): >>> <<< 41175 1727204635.38738: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc'<<< 41175 
1727204635.38792: stdout chunk (state=3): >>> import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc0923c0> <<< 41175 1727204635.38871: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 41175 1727204635.38905: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 41175 1727204635.38941: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 41175 1727204635.39052: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc0aa510> <<< 41175 1727204635.39094: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py<<< 41175 1727204635.39099: stdout chunk (state=3): >>> <<< 41175 1727204635.39160: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc'<<< 41175 1727204635.39166: stdout chunk (state=3): >>> <<< 41175 1727204635.39268: stdout chunk (state=3): >>>import 'ntpath' # <<< 41175 1727204635.39274: stdout chunk (state=3): >>> <<< 41175 1727204635.39316: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py<<< 41175 1727204635.39319: stdout chunk (state=3): >>> <<< 41175 1727204635.39332: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc0e32f0><<< 41175 1727204635.39365: stdout chunk (state=3): >>> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches 
/usr/lib64/python3.12/urllib/parse.py<<< 41175 1727204635.39368: stdout chunk (state=3): >>> <<< 41175 1727204635.39428: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc'<<< 41175 1727204635.39469: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py<<< 41175 1727204635.39472: stdout chunk (state=3): >>> <<< 41175 1727204635.39541: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc'<<< 41175 1727204635.39547: stdout chunk (state=3): >>> <<< 41175 1727204635.39686: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc109a90><<< 41175 1727204635.39800: stdout chunk (state=3): >>> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc0e3410> <<< 41175 1727204635.39876: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc0ab1a0> <<< 41175 1727204635.39930: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py <<< 41175 1727204635.39940: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' <<< 41175 1727204635.39958: stdout chunk (state=3): >>>import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbf20410> <<< 41175 1727204635.39992: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc0a9550><<< 41175 1727204635.39995: stdout chunk (state=3): >>> <<< 41175 1727204635.40009: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc066f60> <<< 41175 1727204635.40163: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/encodings/cp437.pyc' <<< 41175 1727204635.40199: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fe4dbf206e0><<< 41175 1727204635.40322: stdout chunk (state=3): >>> # zipimport: found 30 names in '/tmp/ansible_stat_payload_rznd6us8/ansible_stat_payload.zip'<<< 41175 1727204635.40326: stdout chunk (state=3): >>> <<< 41175 1727204635.40351: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204635.40631: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204635.40679: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 41175 1727204635.40712: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 41175 1727204635.40782: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py<<< 41175 1727204635.40791: stdout chunk (state=3): >>> <<< 41175 1727204635.40913: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 41175 1727204635.40960: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py <<< 41175 1727204635.40972: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' <<< 41175 1727204635.41008: stdout chunk (state=3): >>>import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbf7a1e0> import '_typing' # <<< 41175 1727204635.41327: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbf510d0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbf50230><<< 41175 1727204635.41353: stdout chunk (state=3): >>> # zipimport: zlib 
available <<< 41175 1727204635.41412: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available<<< 41175 1727204635.41419: stdout chunk (state=3): >>> <<< 41175 1727204635.41462: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available<<< 41175 1727204635.41469: stdout chunk (state=3): >>> <<< 41175 1727204635.41508: stdout chunk (state=3): >>>import 'ansible.module_utils' # # zipimport: zlib available <<< 41175 1727204635.44158: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204635.45730: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py <<< 41175 1727204635.45740: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' <<< 41175 1727204635.45747: stdout chunk (state=3): >>>import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbf535f0> <<< 41175 1727204635.45791: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc'<<< 41175 1727204635.45795: stdout chunk (state=3): >>> <<< 41175 1727204635.45819: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 41175 1727204635.45838: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 41175 1727204635.45869: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py <<< 41175 1727204635.45877: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 41175 1727204635.46013: stdout chunk (state=3): >>># extension module '_json' loaded from 
'/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4dbfa5be0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbfa5970> <<< 41175 1727204635.46260: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbfa5280> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbfa59d0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbf7ae70> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4dbfa6900> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4dbfa6b40> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 41175 1727204635.46284: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 41175 1727204635.46318: stdout chunk (state=3): >>>import '_locale' # <<< 41175 1727204635.46382: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbfa6ff0> 
<<< 41175 1727204635.46408: stdout chunk (state=3): >>>import 'pwd' # <<< 41175 1727204635.46439: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 41175 1727204635.46483: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 41175 1727204635.46543: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbe08d70> <<< 41175 1727204635.46585: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 41175 1727204635.46604: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 41175 1727204635.46610: stdout chunk (state=3): >>>import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4dbe0a990> <<< 41175 1727204635.46639: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 41175 1727204635.46671: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 41175 1727204635.46733: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbe0b350> <<< 41175 1727204635.46765: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 41175 1727204635.46808: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 41175 1727204635.46840: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbe0c530> <<< 41175 1727204635.46875: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches 
/usr/lib64/python3.12/subprocess.py <<< 41175 1727204635.46933: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 41175 1727204635.46969: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py <<< 41175 1727204635.46986: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 41175 1727204635.47080: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbe0efc0> <<< 41175 1727204635.47138: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' <<< 41175 1727204635.47155: stdout chunk (state=3): >>># extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' <<< 41175 1727204635.47163: stdout chunk (state=3): >>>import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4dbe0f320> <<< 41175 1727204635.47196: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbe0d280> <<< 41175 1727204635.47228: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 41175 1727204635.47277: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 41175 1727204635.47354: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 41175 1727204635.47384: stdout chunk (state=3): >>># code object 
from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 41175 1727204635.47423: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py <<< 41175 1727204635.47430: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 41175 1727204635.47454: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbe12f30> <<< 41175 1727204635.47475: stdout chunk (state=3): >>>import '_tokenize' # <<< 41175 1727204635.47757: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbe11a00> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbe11760> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbe13f80> <<< 41175 1727204635.47791: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbe0d790> <<< 41175 1727204635.47839: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 41175 1727204635.47843: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 41175 1727204635.47860: stdout chunk (state=3): >>>import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4dbe5b170> <<< 41175 1727204635.47899: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from 
'/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbe5b2f0><<< 41175 1727204635.47905: stdout chunk (state=3): >>> <<< 41175 1727204635.47948: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py<<< 41175 1727204635.47955: stdout chunk (state=3): >>> <<< 41175 1727204635.47983: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc'<<< 41175 1727204635.47999: stdout chunk (state=3): >>> <<< 41175 1727204635.48034: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py<<< 41175 1727204635.48038: stdout chunk (state=3): >>> <<< 41175 1727204635.48041: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc'<<< 41175 1727204635.48057: stdout chunk (state=3): >>> <<< 41175 1727204635.48114: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4dbe5ce90> <<< 41175 1727204635.48130: stdout chunk (state=3): >>>import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbe5cc80> <<< 41175 1727204635.48160: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 41175 1727204635.48339: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 41175 1727204635.48386: stdout chunk (state=3): >>># extension module '_uuid' 
loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4dbe5f3e0> <<< 41175 1727204635.48409: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbe5d580> <<< 41175 1727204635.48412: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 41175 1727204635.48503: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 41175 1727204635.48525: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 41175 1727204635.48555: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 41175 1727204635.48597: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbe66b40> <<< 41175 1727204635.48762: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbe5f4d0> <<< 41175 1727204635.48845: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4dbe67da0> <<< 41175 1727204635.48876: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 
'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4dbe67b00> <<< 41175 1727204635.48935: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' <<< 41175 1727204635.48949: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4dbe67ec0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbe5b5f0> <<< 41175 1727204635.49001: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 41175 1727204635.49009: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 41175 1727204635.49033: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 41175 1727204635.49057: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 41175 1727204635.49100: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4dbe6b590> <<< 41175 1727204635.49320: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension 
module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4dbe6c470> <<< 41175 1727204635.49323: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbe69d30> <<< 41175 1727204635.49361: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4dbe6b080> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbe69910> <<< 41175 1727204635.49585: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available <<< 41175 1727204635.49782: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 41175 1727204635.49815: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204635.50120: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204635.50259: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204635.51477: stdout chunk (state=3): >>># zipimport: zlib available<<< 41175 1727204635.51485: stdout chunk (state=3): >>> <<< 41175 1727204635.52479: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 41175 1727204635.52505: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 
'ansible.module_utils.common.text.converters' # <<< 41175 1727204635.52524: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 41175 1727204635.52561: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 41175 1727204635.52614: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4dbef47d0> <<< 41175 1727204635.52727: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 41175 1727204635.52742: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbef55b0> <<< 41175 1727204635.52777: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbe6f050> <<< 41175 1727204635.52836: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 41175 1727204635.52839: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204635.52860: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204635.52888: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # <<< 41175 1727204635.52900: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204635.53064: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204635.53247: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 41175 
1727204635.53281: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbef5310> <<< 41175 1727204635.53284: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204635.53843: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204635.54405: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204635.54481: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204635.54575: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 41175 1727204635.54588: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204635.54633: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204635.54680: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 41175 1727204635.54684: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204635.54769: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204635.54899: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 41175 1727204635.54903: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204635.54930: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 41175 1727204635.54985: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204635.55025: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 41175 1727204635.55041: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204635.55319: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204635.55604: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 41175 1727204635.55686: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 41175 1727204635.55690: stdout chunk (state=3): >>>import '_ast' # <<< 41175 1727204635.55794: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbef6240> <<< 41175 1727204635.55798: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204635.55879: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204635.55971: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # <<< 41175 1727204635.55986: stdout chunk (state=3): >>>import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 41175 1727204635.56022: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 41175 1727204635.56102: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 41175 1727204635.56233: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4dbefe0c0> <<< 41175 1727204635.56291: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 41175 1727204635.56312: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4dbefea20> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbef6f60> <<< 41175 1727204635.56325: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 41175 1727204635.56368: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204635.56422: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 41175 1727204635.56425: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204635.56464: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204635.56519: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204635.56575: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204635.56649: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 41175 1727204635.56698: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 41175 1727204635.56794: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4dbefd850> <<< 41175 1727204635.56836: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbefecc0> <<< 41175 1727204635.56886: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 41175 1727204635.56890: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204635.56952: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204635.57035: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204635.57047: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 
1727204635.57103: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 41175 1727204635.57122: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 41175 1727204635.57155: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 41175 1727204635.57177: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 41175 1727204635.57243: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 41175 1727204635.57270: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 41175 1727204635.57293: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 41175 1727204635.57337: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbd92cf0> <<< 41175 1727204635.57393: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbd0ca40> <<< 41175 1727204635.57474: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbd0aab0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbd0a900> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 41175 1727204635.57504: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204635.57535: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 
1727204635.57551: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 41175 1727204635.57634: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 41175 1727204635.57648: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # <<< 41175 1727204635.57664: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204635.57811: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204635.58023: stdout chunk (state=3): >>># zipimport: zlib available <<< 41175 1727204635.58161: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 41175 1727204635.58191: stdout chunk (state=3): >>># destroy __main__ <<< 41175 1727204635.58532: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 <<< 41175 1727204635.58556: stdout chunk (state=3): >>># clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp <<< 41175 1727204635.58577: stdout chunk (state=3): >>># cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # 
cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ <<< 41175 1727204635.58609: stdout chunk (state=3): >>># cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre <<< 41175 1727204635.58613: stdout chunk (state=3): >>># cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # 
cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random <<< 41175 1727204635.58662: stdout chunk (state=3): >>># destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing 
_datetime <<< 41175 1727204635.58701: stdout chunk (state=3): >>># cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text <<< 41175 1727204635.58714: stdout chunk (state=3): >>># cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool 
# destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 41175 1727204635.58947: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 41175 1727204635.58988: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma <<< 41175 1727204635.59018: stdout chunk (state=3): >>># destroy 
binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 41175 1727204635.59032: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath <<< 41175 1727204635.59064: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux <<< 41175 1727204635.59086: stdout chunk (state=3): >>># destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd <<< 41175 1727204635.59151: stdout chunk (state=3): >>># destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid <<< 41175 1727204635.59156: stdout chunk (state=3): >>># destroy selectors # destroy errno <<< 41175 1727204635.59178: stdout chunk (state=3): >>># destroy array # destroy datetime # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil <<< 41175 1727204635.59209: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess <<< 41175 1727204635.59256: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc <<< 41175 1727204635.59277: stdout chunk (state=3): >>># cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # 
destroy textwrap # cleanup[3] wiping tokenize <<< 41175 1727204635.59334: stdout chunk (state=3): >>># cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref <<< 41175 1727204635.59369: stdout chunk (state=3): >>># cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler <<< 41175 1727204635.59383: stdout chunk (state=3): >>># destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os <<< 41175 1727204635.59425: stdout chunk (state=3): >>># destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time <<< 41175 1727204635.59470: stdout chunk (state=3): >>># cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping 
_frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 41175 1727204635.59473: stdout chunk (state=3): >>># destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 41175 1727204635.59593: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 41175 1727204635.59634: stdout chunk (state=3): >>># destroy _collections <<< 41175 1727204635.59637: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser <<< 41175 1727204635.59668: stdout chunk (state=3): >>># destroy tokenize <<< 41175 1727204635.59710: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize <<< 41175 1727204635.59738: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 41175 1727204635.59742: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 41175 1727204635.59847: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading <<< 41175 1727204635.59850: stdout chunk (state=3): >>># destroy atexit # destroy _warnings # destroy math # destroy _bisect <<< 41175 1727204635.59874: stdout chunk (state=3): >>># destroy time # destroy 
_random # destroy _weakref <<< 41175 1727204635.59916: stdout chunk (state=3): >>># destroy _operator # destroy _sha2 # destroy _string # destroy re <<< 41175 1727204635.59959: stdout chunk (state=3): >>># destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread <<< 41175 1727204635.59962: stdout chunk (state=3): >>># clear sys.audit hooks <<< 41175 1727204635.60466: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 41175 1727204635.60492: stdout chunk (state=3): >>><<< 41175 1727204635.60496: stderr chunk (state=3): >>><<< 41175 1727204635.60675: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc40c4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc3dbad0> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc40ea20> import '_signal' 
# import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc1bd0a0> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc1bdfd0> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc1fbe90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc1fbf50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc233860> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc233ef0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc213b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc211280> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc1f9040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc257740> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc256360> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc212270> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc1faf30> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc288740> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc1f82c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches 
/usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4dc288bf0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc288aa0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4dc288e30> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc1f6de0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc289520> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc2891f0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc28a420> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches 
/usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc2a4650> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4dc2a5d60> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc2a6c60> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4dc2a72c0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc2a61b0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4dc2a7d40> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc2a7470> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc28a480> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4dc03bcb0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4dc0647a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc064500> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4dc0647d0> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed 
from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4dc0649b0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc039e50> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc066000> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc064c80> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc28ab70> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc0923c0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc0aa510> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py 
# code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc0e32f0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc109a90> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc0e3410> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc0ab1a0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbf20410> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc0a9550> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dc066f60> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fe4dbf206e0> # zipimport: found 30 names in '/tmp/ansible_stat_payload_rznd6us8/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from 
'/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbf7a1e0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbf510d0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbf50230> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbf535f0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from 
'/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4dbfa5be0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbfa5970> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbfa5280> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbfa59d0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbf7ae70> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4dbfa6900> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4dbfa6b40> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbfa6ff0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbe08d70> # extension module 'select' loaded from 
'/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4dbe0a990> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbe0b350> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbe0c530> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbe0efc0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4dbe0f320> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbe0d280> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc 
matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbe12f30> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbe11a00> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbe11760> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbe13f80> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbe0d790> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4dbe5b170> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbe5b2f0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from 
'/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4dbe5ce90> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbe5cc80> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4dbe5f3e0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbe5d580> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbe66b40> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbe5f4d0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # 
extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4dbe67da0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4dbe67b00> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4dbe67ec0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbe5b5f0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4dbe6b590> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from 
'/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4dbe6c470> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbe69d30> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4dbe6b080> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbe69910> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4dbef47d0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches 
/usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbef55b0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbe6f050> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbef5310> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbef6240> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches 
/usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4dbefe0c0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4dbefea20> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbef6f60> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4dbefd850> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbefecc0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available 
# /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbd92cf0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbd0ca40> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbd0aab0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4dbd0a900> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # 
clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # 
cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # 
cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy 
copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] 
removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping 
collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy 
copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. [WARNING]: Module invocation had junk after the JSON data: # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # 
cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] 
removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy 
ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # 
destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping 
ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # 
cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 41175 1727204635.61445: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': 
'2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204635.0628974-41309-159945870145183/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41175 1727204635.61448: _low_level_execute_command(): starting 41175 1727204635.61451: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204635.0628974-41309-159945870145183/ > /dev/null 2>&1 && sleep 0' 41175 1727204635.61861: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 1727204635.61882: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204635.61902: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204635.61934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204635.61998: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204635.62067: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204635.62086: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204635.62116: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204635.62194: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204635.64273: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204635.64316: stdout chunk (state=3): >>><<< 41175 1727204635.64320: stderr chunk (state=3): >>><<< 41175 1727204635.64338: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204635.64495: handler run complete 41175 1727204635.64499: attempt loop complete, returning result 41175 1727204635.64502: _execute() done 41175 1727204635.64504: dumping result to json 41175 
1727204635.64506: done dumping result, returning 41175 1727204635.64508: done running TaskExecutor() for managed-node3/TASK: Check if system is ostree [12b410aa-8751-f070-39c4-000000000108] 41175 1727204635.64510: sending task result for task 12b410aa-8751-f070-39c4-000000000108 41175 1727204635.64578: done sending task result for task 12b410aa-8751-f070-39c4-000000000108 41175 1727204635.64581: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "stat": { "exists": false } } 41175 1727204635.64661: no more pending results, returning what we have 41175 1727204635.64664: results queue empty 41175 1727204635.64665: checking for any_errors_fatal 41175 1727204635.64674: done checking for any_errors_fatal 41175 1727204635.64675: checking for max_fail_percentage 41175 1727204635.64676: done checking for max_fail_percentage 41175 1727204635.64677: checking to see if all hosts have failed and the running result is not ok 41175 1727204635.64678: done checking to see if all hosts have failed 41175 1727204635.64679: getting the remaining hosts for this loop 41175 1727204635.64681: done getting the remaining hosts for this loop 41175 1727204635.64686: getting the next task for host managed-node3 41175 1727204635.64695: done getting next task for host managed-node3 41175 1727204635.64699: ^ task is: TASK: Set flag to indicate system is ostree 41175 1727204635.64702: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204635.64706: getting variables 41175 1727204635.64707: in VariableManager get_vars() 41175 1727204635.64740: Calling all_inventory to load vars for managed-node3 41175 1727204635.64743: Calling groups_inventory to load vars for managed-node3 41175 1727204635.64747: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204635.64761: Calling all_plugins_play to load vars for managed-node3 41175 1727204635.64765: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204635.64768: Calling groups_plugins_play to load vars for managed-node3 41175 1727204635.65403: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204635.65798: done with get_vars() 41175 1727204635.65812: done getting variables 41175 1727204635.65926: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Tuesday 24 September 2024 15:03:55 -0400 (0:00:00.656) 0:00:02.798 ***** 41175 1727204635.65959: entering _queue_task() for managed-node3/set_fact 41175 1727204635.65961: Creating lock for set_fact 41175 1727204635.66287: worker is 1 (out of 1 available) 41175 1727204635.66303: exiting _queue_task() for managed-node3/set_fact 41175 1727204635.66315: done queuing things up, now waiting for results queue to drain 41175 1727204635.66316: waiting for pending results... 
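The trace above shows the preceding task, `Check if system is ostree`, invoking the `stat` module with `'path': '/run/ostree-booted'` and returning `"exists": false`. From the module arguments and the `__ostree_booted_stat` variable visible later in the trace, the task likely resembles the following sketch (a reconstruction, not the verbatim contents of `el_repo_setup.yml`):

```yaml
# Hedged reconstruction from the trace; the actual task file may differ.
- name: Check if system is ostree
  ansible.builtin.stat:
    path: /run/ostree-booted
  register: __ostree_booted_stat
```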
41175 1727204635.66718: running TaskExecutor() for managed-node3/TASK: Set flag to indicate system is ostree 41175 1727204635.66723: in run() - task 12b410aa-8751-f070-39c4-000000000109 41175 1727204635.66740: variable 'ansible_search_path' from source: unknown 41175 1727204635.66807: variable 'ansible_search_path' from source: unknown 41175 1727204635.66811: calling self._execute() 41175 1727204635.66894: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204635.66911: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204635.66941: variable 'omit' from source: magic vars 41175 1727204635.67568: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41175 1727204635.67898: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41175 1727204635.67973: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41175 1727204635.68036: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41175 1727204635.68096: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41175 1727204635.68212: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41175 1727204635.68240: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41175 1727204635.68271: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204635.68295: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41175 1727204635.68415: Evaluated conditional (not __network_is_ostree is defined): True 41175 1727204635.68424: variable 'omit' from source: magic vars 41175 1727204635.68457: variable 'omit' from source: magic vars 41175 1727204635.68560: variable '__ostree_booted_stat' from source: set_fact 41175 1727204635.68604: variable 'omit' from source: magic vars 41175 1727204635.68629: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41175 1727204635.68652: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41175 1727204635.68668: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41175 1727204635.68690: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204635.68701: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204635.68731: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41175 1727204635.68734: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204635.68737: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204635.68827: Set connection var ansible_shell_executable to /bin/sh 41175 1727204635.68830: Set connection var ansible_shell_type to sh 41175 1727204635.68837: Set connection var ansible_pipelining to False 41175 1727204635.68845: Set connection var ansible_timeout to 10 41175 1727204635.68852: Set connection var ansible_connection to ssh 41175 1727204635.68858: Set connection var ansible_module_compression to ZIP_DEFLATED 41175 1727204635.68879: variable 'ansible_shell_executable' 
from source: unknown 41175 1727204635.68882: variable 'ansible_connection' from source: unknown 41175 1727204635.68885: variable 'ansible_module_compression' from source: unknown 41175 1727204635.68889: variable 'ansible_shell_type' from source: unknown 41175 1727204635.68895: variable 'ansible_shell_executable' from source: unknown 41175 1727204635.68898: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204635.68902: variable 'ansible_pipelining' from source: unknown 41175 1727204635.68913: variable 'ansible_timeout' from source: unknown 41175 1727204635.68916: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204635.69026: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41175 1727204635.69037: variable 'omit' from source: magic vars 41175 1727204635.69043: starting attempt loop 41175 1727204635.69047: running the handler 41175 1727204635.69058: handler run complete 41175 1727204635.69067: attempt loop complete, returning result 41175 1727204635.69070: _execute() done 41175 1727204635.69073: dumping result to json 41175 1727204635.69078: done dumping result, returning 41175 1727204635.69085: done running TaskExecutor() for managed-node3/TASK: Set flag to indicate system is ostree [12b410aa-8751-f070-39c4-000000000109] 41175 1727204635.69091: sending task result for task 12b410aa-8751-f070-39c4-000000000109 41175 1727204635.69181: done sending task result for task 12b410aa-8751-f070-39c4-000000000109 41175 1727204635.69184: WORKER PROCESS EXITING ok: [managed-node3] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 41175 1727204635.69277: no more pending results, returning what we have 41175 1727204635.69280: results 
queue empty 41175 1727204635.69281: checking for any_errors_fatal 41175 1727204635.69288: done checking for any_errors_fatal 41175 1727204635.69294: checking for max_fail_percentage 41175 1727204635.69296: done checking for max_fail_percentage 41175 1727204635.69297: checking to see if all hosts have failed and the running result is not ok 41175 1727204635.69298: done checking to see if all hosts have failed 41175 1727204635.69299: getting the remaining hosts for this loop 41175 1727204635.69306: done getting the remaining hosts for this loop 41175 1727204635.69311: getting the next task for host managed-node3 41175 1727204635.69320: done getting next task for host managed-node3 41175 1727204635.69323: ^ task is: TASK: Fix CentOS6 Base repo 41175 1727204635.69325: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204635.69329: getting variables 41175 1727204635.69330: in VariableManager get_vars() 41175 1727204635.69359: Calling all_inventory to load vars for managed-node3 41175 1727204635.69362: Calling groups_inventory to load vars for managed-node3 41175 1727204635.69365: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204635.69376: Calling all_plugins_play to load vars for managed-node3 41175 1727204635.69379: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204635.69388: Calling groups_plugins_play to load vars for managed-node3 41175 1727204635.69549: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204635.69727: done with get_vars() 41175 1727204635.69738: done getting variables 41175 1727204635.69839: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Tuesday 24 September 2024 15:03:55 -0400 (0:00:00.039) 0:00:02.837 ***** 41175 1727204635.69870: entering _queue_task() for managed-node3/copy 41175 1727204635.70159: worker is 1 (out of 1 available) 41175 1727204635.70173: exiting _queue_task() for managed-node3/copy 41175 1727204635.70185: done queuing things up, now waiting for results queue to drain 41175 1727204635.70187: waiting for pending results... 
41175 1727204635.70613: running TaskExecutor() for managed-node3/TASK: Fix CentOS6 Base repo 41175 1727204635.70621: in run() - task 12b410aa-8751-f070-39c4-00000000010b 41175 1727204635.70624: variable 'ansible_search_path' from source: unknown 41175 1727204635.70627: variable 'ansible_search_path' from source: unknown 41175 1727204635.70650: calling self._execute() 41175 1727204635.70740: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204635.70754: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204635.70769: variable 'omit' from source: magic vars 41175 1727204635.71381: variable 'ansible_distribution' from source: facts 41175 1727204635.71413: Evaluated conditional (ansible_distribution == 'CentOS'): False 41175 1727204635.71489: when evaluation is False, skipping this task 41175 1727204635.71494: _execute() done 41175 1727204635.71497: dumping result to json 41175 1727204635.71500: done dumping result, returning 41175 1727204635.71502: done running TaskExecutor() for managed-node3/TASK: Fix CentOS6 Base repo [12b410aa-8751-f070-39c4-00000000010b] 41175 1727204635.71505: sending task result for task 12b410aa-8751-f070-39c4-00000000010b 41175 1727204635.71575: done sending task result for task 12b410aa-8751-f070-39c4-00000000010b 41175 1727204635.71578: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution == 'CentOS'", "skip_reason": "Conditional result was False" } 41175 1727204635.71661: no more pending results, returning what we have 41175 1727204635.71665: results queue empty 41175 1727204635.71666: checking for any_errors_fatal 41175 1727204635.71670: done checking for any_errors_fatal 41175 1727204635.71671: checking for max_fail_percentage 41175 1727204635.71672: done checking for max_fail_percentage 41175 1727204635.71673: checking to see if all hosts have failed and the running result is not ok 41175 1727204635.71674: done 
checking to see if all hosts have failed 41175 1727204635.71675: getting the remaining hosts for this loop 41175 1727204635.71677: done getting the remaining hosts for this loop 41175 1727204635.71680: getting the next task for host managed-node3 41175 1727204635.71686: done getting next task for host managed-node3 41175 1727204635.71691: ^ task is: TASK: Include the task 'enable_epel.yml' 41175 1727204635.71694: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41175 1727204635.71698: getting variables 41175 1727204635.71699: in VariableManager get_vars() 41175 1727204635.71725: Calling all_inventory to load vars for managed-node3 41175 1727204635.71728: Calling groups_inventory to load vars for managed-node3 41175 1727204635.71731: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204635.71742: Calling all_plugins_play to load vars for managed-node3 41175 1727204635.71745: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204635.71748: Calling groups_plugins_play to load vars for managed-node3 41175 1727204635.72032: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204635.72342: done with get_vars() 41175 1727204635.72355: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: 
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Tuesday 24 September 2024 15:03:55 -0400 (0:00:00.025) 0:00:02.863 ***** 41175 1727204635.72455: entering _queue_task() for managed-node3/include_tasks 41175 1727204635.72676: worker is 1 (out of 1 available) 41175 1727204635.72691: exiting _queue_task() for managed-node3/include_tasks 41175 1727204635.72703: done queuing things up, now waiting for results queue to drain 41175 1727204635.72705: waiting for pending results... 41175 1727204635.72874: running TaskExecutor() for managed-node3/TASK: Include the task 'enable_epel.yml' 41175 1727204635.72951: in run() - task 12b410aa-8751-f070-39c4-00000000010c 41175 1727204635.72962: variable 'ansible_search_path' from source: unknown 41175 1727204635.72966: variable 'ansible_search_path' from source: unknown 41175 1727204635.72996: calling self._execute() 41175 1727204635.73061: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204635.73068: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204635.73077: variable 'omit' from source: magic vars 41175 1727204635.73487: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41175 1727204635.75887: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41175 1727204635.75954: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41175 1727204635.75984: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41175 1727204635.76016: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41175 1727204635.76045: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41175 1727204635.76113: 
Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204635.76146: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204635.76168: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204635.76202: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204635.76215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204635.76316: variable '__network_is_ostree' from source: set_fact 41175 1727204635.76334: Evaluated conditional (not __network_is_ostree | d(false)): True 41175 1727204635.76342: _execute() done 41175 1727204635.76345: dumping result to json 41175 1727204635.76350: done dumping result, returning 41175 1727204635.76358: done running TaskExecutor() for managed-node3/TASK: Include the task 'enable_epel.yml' [12b410aa-8751-f070-39c4-00000000010c] 41175 1727204635.76362: sending task result for task 12b410aa-8751-f070-39c4-00000000010c 41175 1727204635.76460: done sending task result for task 12b410aa-8751-f070-39c4-00000000010c 41175 1727204635.76463: WORKER PROCESS EXITING 41175 1727204635.76498: no more pending results, returning what we have 41175 1727204635.76503: in VariableManager get_vars() 41175 1727204635.76539: Calling all_inventory to load vars for managed-node3 41175 
1727204635.76542: Calling groups_inventory to load vars for managed-node3 41175 1727204635.76546: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204635.76558: Calling all_plugins_play to load vars for managed-node3 41175 1727204635.76561: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204635.76565: Calling groups_plugins_play to load vars for managed-node3 41175 1727204635.76765: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204635.76991: done with get_vars() 41175 1727204635.77001: variable 'ansible_search_path' from source: unknown 41175 1727204635.77003: variable 'ansible_search_path' from source: unknown 41175 1727204635.77047: we have included files to process 41175 1727204635.77049: generating all_blocks data 41175 1727204635.77051: done generating all_blocks data 41175 1727204635.77056: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 41175 1727204635.77058: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 41175 1727204635.77061: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 41175 1727204635.77925: done processing included file 41175 1727204635.77928: iterating over new_blocks loaded from include file 41175 1727204635.77930: in VariableManager get_vars() 41175 1727204635.77945: done with get_vars() 41175 1727204635.77947: filtering new block on tags 41175 1727204635.77976: done filtering new block on tags 41175 1727204635.77980: in VariableManager get_vars() 41175 1727204635.77994: done with get_vars() 41175 1727204635.77996: filtering new block on tags 41175 1727204635.78010: done filtering new block on tags 41175 1727204635.78012: done iterating over new_blocks loaded from include file included: 
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed-node3 41175 1727204635.78021: extending task lists for all hosts with included blocks 41175 1727204635.78158: done extending task lists 41175 1727204635.78159: done processing included files 41175 1727204635.78160: results queue empty 41175 1727204635.78161: checking for any_errors_fatal 41175 1727204635.78165: done checking for any_errors_fatal 41175 1727204635.78166: checking for max_fail_percentage 41175 1727204635.78167: done checking for max_fail_percentage 41175 1727204635.78168: checking to see if all hosts have failed and the running result is not ok 41175 1727204635.78169: done checking to see if all hosts have failed 41175 1727204635.78170: getting the remaining hosts for this loop 41175 1727204635.78172: done getting the remaining hosts for this loop 41175 1727204635.78175: getting the next task for host managed-node3 41175 1727204635.78179: done getting next task for host managed-node3 41175 1727204635.78182: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 41175 1727204635.78185: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204635.78188: getting variables 41175 1727204635.78190: in VariableManager get_vars() 41175 1727204635.78200: Calling all_inventory to load vars for managed-node3 41175 1727204635.78203: Calling groups_inventory to load vars for managed-node3 41175 1727204635.78206: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204635.78212: Calling all_plugins_play to load vars for managed-node3 41175 1727204635.78223: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204635.78227: Calling groups_plugins_play to load vars for managed-node3 41175 1727204635.78634: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204635.78943: done with get_vars() 41175 1727204635.78954: done getting variables 41175 1727204635.79033: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 41175 1727204635.79274: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 39] ********************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Tuesday 24 September 2024 15:03:55 -0400 (0:00:00.068) 0:00:02.932 ***** 41175 1727204635.79333: entering _queue_task() for managed-node3/command 41175 1727204635.79336: Creating lock for command 41175 1727204635.79597: worker is 1 (out of 1 available) 41175 1727204635.79612: exiting _queue_task() for managed-node3/command 41175 1727204635.79627: done queuing things up, now waiting for results queue to drain 41175 1727204635.79629: waiting for pending results... 
41175 1727204635.79823: running TaskExecutor() for managed-node3/TASK: Create EPEL 39 41175 1727204635.79904: in run() - task 12b410aa-8751-f070-39c4-000000000126 41175 1727204635.79927: variable 'ansible_search_path' from source: unknown 41175 1727204635.79932: variable 'ansible_search_path' from source: unknown 41175 1727204635.79955: calling self._execute() 41175 1727204635.80019: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204635.80023: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204635.80036: variable 'omit' from source: magic vars 41175 1727204635.80359: variable 'ansible_distribution' from source: facts 41175 1727204635.80372: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 41175 1727204635.80376: when evaluation is False, skipping this task 41175 1727204635.80380: _execute() done 41175 1727204635.80383: dumping result to json 41175 1727204635.80386: done dumping result, returning 41175 1727204635.80393: done running TaskExecutor() for managed-node3/TASK: Create EPEL 39 [12b410aa-8751-f070-39c4-000000000126] 41175 1727204635.80400: sending task result for task 12b410aa-8751-f070-39c4-000000000126 41175 1727204635.80508: done sending task result for task 12b410aa-8751-f070-39c4-000000000126 41175 1727204635.80512: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 41175 1727204635.80575: no more pending results, returning what we have 41175 1727204635.80584: results queue empty 41175 1727204635.80585: checking for any_errors_fatal 41175 1727204635.80587: done checking for any_errors_fatal 41175 1727204635.80588: checking for max_fail_percentage 41175 1727204635.80591: done checking for max_fail_percentage 41175 1727204635.80592: checking to see if all hosts have failed and the running result is not ok 41175 
1727204635.80593: done checking to see if all hosts have failed 41175 1727204635.80594: getting the remaining hosts for this loop 41175 1727204635.80595: done getting the remaining hosts for this loop 41175 1727204635.80599: getting the next task for host managed-node3 41175 1727204635.80605: done getting next task for host managed-node3 41175 1727204635.80607: ^ task is: TASK: Install yum-utils package 41175 1727204635.80611: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204635.80614: getting variables 41175 1727204635.80615: in VariableManager get_vars() 41175 1727204635.80650: Calling all_inventory to load vars for managed-node3 41175 1727204635.80653: Calling groups_inventory to load vars for managed-node3 41175 1727204635.80656: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204635.80667: Calling all_plugins_play to load vars for managed-node3 41175 1727204635.80670: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204635.80672: Calling groups_plugins_play to load vars for managed-node3 41175 1727204635.80832: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204635.81036: done with get_vars() 41175 1727204635.81043: done getting variables 41175 1727204635.81123: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Tuesday 24 September 2024 15:03:55 -0400 (0:00:00.018) 0:00:02.950 ***** 41175 1727204635.81146: entering _queue_task() for managed-node3/package 41175 1727204635.81148: Creating lock for package 41175 1727204635.81355: worker is 1 (out of 1 available) 41175 1727204635.81373: exiting _queue_task() for managed-node3/package 41175 1727204635.81385: done queuing things up, now waiting for results queue to drain 41175 1727204635.81387: waiting for pending results... 
41175 1727204635.81719: running TaskExecutor() for managed-node3/TASK: Install yum-utils package 41175 1727204635.81724: in run() - task 12b410aa-8751-f070-39c4-000000000127 41175 1727204635.81732: variable 'ansible_search_path' from source: unknown 41175 1727204635.81735: variable 'ansible_search_path' from source: unknown 41175 1727204635.81771: calling self._execute() 41175 1727204635.81849: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204635.81857: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204635.81870: variable 'omit' from source: magic vars 41175 1727204635.82306: variable 'ansible_distribution' from source: facts 41175 1727204635.82361: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 41175 1727204635.82365: when evaluation is False, skipping this task 41175 1727204635.82368: _execute() done 41175 1727204635.82371: dumping result to json 41175 1727204635.82374: done dumping result, returning 41175 1727204635.82376: done running TaskExecutor() for managed-node3/TASK: Install yum-utils package [12b410aa-8751-f070-39c4-000000000127] 41175 1727204635.82378: sending task result for task 12b410aa-8751-f070-39c4-000000000127 41175 1727204635.82449: done sending task result for task 12b410aa-8751-f070-39c4-000000000127 41175 1727204635.82452: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 41175 1727204635.82504: no more pending results, returning what we have 41175 1727204635.82508: results queue empty 41175 1727204635.82509: checking for any_errors_fatal 41175 1727204635.82515: done checking for any_errors_fatal 41175 1727204635.82515: checking for max_fail_percentage 41175 1727204635.82519: done checking for max_fail_percentage 41175 1727204635.82520: checking to see if all hosts have failed and the running result is not ok 
41175 1727204635.82521: done checking to see if all hosts have failed 41175 1727204635.82522: getting the remaining hosts for this loop 41175 1727204635.82523: done getting the remaining hosts for this loop 41175 1727204635.82527: getting the next task for host managed-node3 41175 1727204635.82532: done getting next task for host managed-node3 41175 1727204635.82535: ^ task is: TASK: Enable EPEL 7 41175 1727204635.82538: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204635.82542: getting variables 41175 1727204635.82543: in VariableManager get_vars() 41175 1727204635.82568: Calling all_inventory to load vars for managed-node3 41175 1727204635.82571: Calling groups_inventory to load vars for managed-node3 41175 1727204635.82575: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204635.82585: Calling all_plugins_play to load vars for managed-node3 41175 1727204635.82588: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204635.82595: Calling groups_plugins_play to load vars for managed-node3 41175 1727204635.82863: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204635.83224: done with get_vars() 41175 1727204635.83244: done getting variables 41175 1727204635.83309: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Tuesday 24 September 2024 15:03:55 -0400 (0:00:00.021) 0:00:02.972 ***** 41175 1727204635.83343: entering _queue_task() for managed-node3/command 41175 1727204635.83612: worker is 1 (out of 1 available) 41175 1727204635.83628: exiting _queue_task() for managed-node3/command 41175 1727204635.83642: done queuing things up, now waiting for results queue to drain 41175 1727204635.83644: waiting for pending results... 
41175 1727204635.84032: running TaskExecutor() for managed-node3/TASK: Enable EPEL 7 41175 1727204635.84037: in run() - task 12b410aa-8751-f070-39c4-000000000128 41175 1727204635.84039: variable 'ansible_search_path' from source: unknown 41175 1727204635.84042: variable 'ansible_search_path' from source: unknown 41175 1727204635.84045: calling self._execute() 41175 1727204635.84085: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204635.84095: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204635.84126: variable 'omit' from source: magic vars 41175 1727204635.84532: variable 'ansible_distribution' from source: facts 41175 1727204635.84545: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 41175 1727204635.84555: when evaluation is False, skipping this task 41175 1727204635.84566: _execute() done 41175 1727204635.84569: dumping result to json 41175 1727204635.84572: done dumping result, returning 41175 1727204635.84575: done running TaskExecutor() for managed-node3/TASK: Enable EPEL 7 [12b410aa-8751-f070-39c4-000000000128] 41175 1727204635.84577: sending task result for task 12b410aa-8751-f070-39c4-000000000128 skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 41175 1727204635.84815: no more pending results, returning what we have 41175 1727204635.84821: results queue empty 41175 1727204635.84822: checking for any_errors_fatal 41175 1727204635.84826: done checking for any_errors_fatal 41175 1727204635.84827: checking for max_fail_percentage 41175 1727204635.84829: done checking for max_fail_percentage 41175 1727204635.84830: checking to see if all hosts have failed and the running result is not ok 41175 1727204635.84831: done checking to see if all hosts have failed 41175 1727204635.84832: getting the remaining hosts for this loop 41175 1727204635.84834: done 
getting the remaining hosts for this loop 41175 1727204635.84837: getting the next task for host managed-node3 41175 1727204635.84844: done getting next task for host managed-node3 41175 1727204635.84848: ^ task is: TASK: Enable EPEL 8 41175 1727204635.84853: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204635.84856: getting variables 41175 1727204635.84858: in VariableManager get_vars() 41175 1727204635.84887: Calling all_inventory to load vars for managed-node3 41175 1727204635.84892: Calling groups_inventory to load vars for managed-node3 41175 1727204635.84896: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204635.84907: Calling all_plugins_play to load vars for managed-node3 41175 1727204635.84910: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204635.84914: Calling groups_plugins_play to load vars for managed-node3 41175 1727204635.85198: done sending task result for task 12b410aa-8751-f070-39c4-000000000128 41175 1727204635.85202: WORKER PROCESS EXITING 41175 1727204635.85237: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204635.85543: done with get_vars() 41175 1727204635.85555: done getting variables 41175 1727204635.85619: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Tuesday 24 September 2024 15:03:55 -0400 (0:00:00.023) 0:00:02.995 ***** 41175 1727204635.85658: entering _queue_task() for managed-node3/command 41175 1727204635.85910: worker is 1 (out of 1 available) 41175 1727204635.85925: exiting _queue_task() for managed-node3/command 41175 1727204635.85935: done queuing things up, now waiting for results queue to drain 41175 1727204635.85937: waiting for pending results... 
41175 1727204635.86229: running TaskExecutor() for managed-node3/TASK: Enable EPEL 8 41175 1727204635.86357: in run() - task 12b410aa-8751-f070-39c4-000000000129 41175 1727204635.86376: variable 'ansible_search_path' from source: unknown 41175 1727204635.86384: variable 'ansible_search_path' from source: unknown 41175 1727204635.86428: calling self._execute() 41175 1727204635.86517: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204635.86532: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204635.86551: variable 'omit' from source: magic vars 41175 1727204635.86967: variable 'ansible_distribution' from source: facts 41175 1727204635.86987: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 41175 1727204635.86999: when evaluation is False, skipping this task 41175 1727204635.87007: _execute() done 41175 1727204635.87015: dumping result to json 41175 1727204635.87025: done dumping result, returning 41175 1727204635.87036: done running TaskExecutor() for managed-node3/TASK: Enable EPEL 8 [12b410aa-8751-f070-39c4-000000000129] 41175 1727204635.87047: sending task result for task 12b410aa-8751-f070-39c4-000000000129 skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 41175 1727204635.87310: no more pending results, returning what we have 41175 1727204635.87313: results queue empty 41175 1727204635.87314: checking for any_errors_fatal 41175 1727204635.87322: done checking for any_errors_fatal 41175 1727204635.87323: checking for max_fail_percentage 41175 1727204635.87324: done checking for max_fail_percentage 41175 1727204635.87325: checking to see if all hosts have failed and the running result is not ok 41175 1727204635.87326: done checking to see if all hosts have failed 41175 1727204635.87327: getting the remaining hosts for this loop 41175 1727204635.87328: done 
getting the remaining hosts for this loop 41175 1727204635.87332: getting the next task for host managed-node3 41175 1727204635.87341: done getting next task for host managed-node3 41175 1727204635.87344: ^ task is: TASK: Enable EPEL 6 41175 1727204635.87348: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204635.87351: getting variables 41175 1727204635.87352: in VariableManager get_vars() 41175 1727204635.87377: Calling all_inventory to load vars for managed-node3 41175 1727204635.87380: Calling groups_inventory to load vars for managed-node3 41175 1727204635.87384: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204635.87398: done sending task result for task 12b410aa-8751-f070-39c4-000000000129 41175 1727204635.87402: WORKER PROCESS EXITING 41175 1727204635.87412: Calling all_plugins_play to load vars for managed-node3 41175 1727204635.87416: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204635.87422: Calling groups_plugins_play to load vars for managed-node3 41175 1727204635.87685: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204635.88037: done with get_vars() 41175 1727204635.88057: done getting variables 41175 1727204635.88123: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Tuesday 24 September 2024 15:03:55 -0400 (0:00:00.024) 0:00:03.020 ***** 41175 1727204635.88161: entering _queue_task() for managed-node3/copy 41175 1727204635.88609: worker is 1 (out of 1 available) 41175 1727204635.88621: exiting _queue_task() for managed-node3/copy 41175 1727204635.88629: done queuing things up, now waiting for results queue to drain 41175 1727204635.88630: waiting for pending results... 
41175 1727204635.88700: running TaskExecutor() for managed-node3/TASK: Enable EPEL 6 41175 1727204635.88835: in run() - task 12b410aa-8751-f070-39c4-00000000012b 41175 1727204635.88865: variable 'ansible_search_path' from source: unknown 41175 1727204635.88873: variable 'ansible_search_path' from source: unknown 41175 1727204635.88916: calling self._execute() 41175 1727204635.89019: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204635.89033: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204635.89048: variable 'omit' from source: magic vars 41175 1727204635.89602: variable 'ansible_distribution' from source: facts 41175 1727204635.89632: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 41175 1727204635.89640: when evaluation is False, skipping this task 41175 1727204635.89648: _execute() done 41175 1727204635.89694: dumping result to json 41175 1727204635.89697: done dumping result, returning 41175 1727204635.89700: done running TaskExecutor() for managed-node3/TASK: Enable EPEL 6 [12b410aa-8751-f070-39c4-00000000012b] 41175 1727204635.89702: sending task result for task 12b410aa-8751-f070-39c4-00000000012b skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 41175 1727204635.89992: no more pending results, returning what we have 41175 1727204635.89997: results queue empty 41175 1727204635.89998: checking for any_errors_fatal 41175 1727204635.90004: done checking for any_errors_fatal 41175 1727204635.90005: checking for max_fail_percentage 41175 1727204635.90007: done checking for max_fail_percentage 41175 1727204635.90008: checking to see if all hosts have failed and the running result is not ok 41175 1727204635.90009: done checking to see if all hosts have failed 41175 1727204635.90010: getting the remaining hosts for this loop 41175 1727204635.90012: done 
getting the remaining hosts for this loop 41175 1727204635.90016: getting the next task for host managed-node3 41175 1727204635.90029: done getting next task for host managed-node3 41175 1727204635.90032: ^ task is: TASK: Set network provider to 'nm' 41175 1727204635.90034: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41175 1727204635.90038: getting variables 41175 1727204635.90040: in VariableManager get_vars() 41175 1727204635.90077: Calling all_inventory to load vars for managed-node3 41175 1727204635.90081: Calling groups_inventory to load vars for managed-node3 41175 1727204635.90085: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204635.90174: Calling all_plugins_play to load vars for managed-node3 41175 1727204635.90178: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204635.90182: Calling groups_plugins_play to load vars for managed-node3 41175 1727204635.90196: done sending task result for task 12b410aa-8751-f070-39c4-00000000012b 41175 1727204635.90199: WORKER PROCESS EXITING 41175 1727204635.90528: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204635.90872: done with get_vars() 41175 1727204635.90884: done getting variables 41175 1727204635.90961: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: 
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tests_route_table_nm.yml:13 Tuesday 24 September 2024 15:03:55 -0400 (0:00:00.028) 0:00:03.048 ***** 41175 1727204635.90992: entering _queue_task() for managed-node3/set_fact 41175 1727204635.91374: worker is 1 (out of 1 available) 41175 1727204635.91386: exiting _queue_task() for managed-node3/set_fact 41175 1727204635.91399: done queuing things up, now waiting for results queue to drain 41175 1727204635.91401: waiting for pending results... 41175 1727204635.91589: running TaskExecutor() for managed-node3/TASK: Set network provider to 'nm' 41175 1727204635.91705: in run() - task 12b410aa-8751-f070-39c4-000000000007 41175 1727204635.91727: variable 'ansible_search_path' from source: unknown 41175 1727204635.91771: calling self._execute() 41175 1727204635.91867: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204635.91879: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204635.91898: variable 'omit' from source: magic vars 41175 1727204635.92059: variable 'omit' from source: magic vars 41175 1727204635.92106: variable 'omit' from source: magic vars 41175 1727204635.92177: variable 'omit' from source: magic vars 41175 1727204635.92249: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41175 1727204635.92358: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41175 1727204635.92362: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41175 1727204635.92367: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204635.92388: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204635.92434: variable 
'inventory_hostname' from source: host vars for 'managed-node3' 41175 1727204635.92445: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204635.92454: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204635.92686: Set connection var ansible_shell_executable to /bin/sh 41175 1727204635.92691: Set connection var ansible_shell_type to sh 41175 1727204635.92694: Set connection var ansible_pipelining to False 41175 1727204635.92696: Set connection var ansible_timeout to 10 41175 1727204635.92699: Set connection var ansible_connection to ssh 41175 1727204635.92701: Set connection var ansible_module_compression to ZIP_DEFLATED 41175 1727204635.92706: variable 'ansible_shell_executable' from source: unknown 41175 1727204635.92715: variable 'ansible_connection' from source: unknown 41175 1727204635.92731: variable 'ansible_module_compression' from source: unknown 41175 1727204635.92739: variable 'ansible_shell_type' from source: unknown 41175 1727204635.92747: variable 'ansible_shell_executable' from source: unknown 41175 1727204635.92757: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204635.92795: variable 'ansible_pipelining' from source: unknown 41175 1727204635.92800: variable 'ansible_timeout' from source: unknown 41175 1727204635.92803: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204635.92995: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41175 1727204635.93026: variable 'omit' from source: magic vars 41175 1727204635.93051: starting attempt loop 41175 1727204635.93055: running the handler 41175 1727204635.93116: handler run complete 41175 1727204635.93127: attempt loop 
complete, returning result 41175 1727204635.93130: _execute() done 41175 1727204635.93133: dumping result to json 41175 1727204635.93135: done dumping result, returning 41175 1727204635.93137: done running TaskExecutor() for managed-node3/TASK: Set network provider to 'nm' [12b410aa-8751-f070-39c4-000000000007] 41175 1727204635.93139: sending task result for task 12b410aa-8751-f070-39c4-000000000007 41175 1727204635.93298: done sending task result for task 12b410aa-8751-f070-39c4-000000000007 41175 1727204635.93302: WORKER PROCESS EXITING ok: [managed-node3] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 41175 1727204635.93484: no more pending results, returning what we have 41175 1727204635.93488: results queue empty 41175 1727204635.93491: checking for any_errors_fatal 41175 1727204635.93498: done checking for any_errors_fatal 41175 1727204635.93499: checking for max_fail_percentage 41175 1727204635.93500: done checking for max_fail_percentage 41175 1727204635.93501: checking to see if all hosts have failed and the running result is not ok 41175 1727204635.93503: done checking to see if all hosts have failed 41175 1727204635.93504: getting the remaining hosts for this loop 41175 1727204635.93506: done getting the remaining hosts for this loop 41175 1727204635.93510: getting the next task for host managed-node3 41175 1727204635.93522: done getting next task for host managed-node3 41175 1727204635.93525: ^ task is: TASK: meta (flush_handlers) 41175 1727204635.93527: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204635.93532: getting variables 41175 1727204635.93534: in VariableManager get_vars() 41175 1727204635.93567: Calling all_inventory to load vars for managed-node3 41175 1727204635.93570: Calling groups_inventory to load vars for managed-node3 41175 1727204635.93574: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204635.93588: Calling all_plugins_play to load vars for managed-node3 41175 1727204635.93697: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204635.93709: Calling groups_plugins_play to load vars for managed-node3 41175 1727204635.93982: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204635.94381: done with get_vars() 41175 1727204635.94397: done getting variables 41175 1727204635.94483: in VariableManager get_vars() 41175 1727204635.94496: Calling all_inventory to load vars for managed-node3 41175 1727204635.94499: Calling groups_inventory to load vars for managed-node3 41175 1727204635.94503: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204635.94508: Calling all_plugins_play to load vars for managed-node3 41175 1727204635.94511: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204635.94515: Calling groups_plugins_play to load vars for managed-node3 41175 1727204635.94751: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204635.95086: done with get_vars() 41175 1727204635.95105: done queuing things up, now waiting for results queue to drain 41175 1727204635.95107: results queue empty 41175 1727204635.95109: checking for any_errors_fatal 41175 1727204635.95111: done checking for any_errors_fatal 41175 1727204635.95113: checking for max_fail_percentage 41175 1727204635.95114: done checking for max_fail_percentage 41175 1727204635.95115: checking to see if all hosts have failed and the running result is not 
ok 41175 1727204635.95116: done checking to see if all hosts have failed 41175 1727204635.95123: getting the remaining hosts for this loop 41175 1727204635.95125: done getting the remaining hosts for this loop 41175 1727204635.95128: getting the next task for host managed-node3 41175 1727204635.95136: done getting next task for host managed-node3 41175 1727204635.95139: ^ task is: TASK: meta (flush_handlers) 41175 1727204635.95140: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41175 1727204635.95150: getting variables 41175 1727204635.95152: in VariableManager get_vars() 41175 1727204635.95162: Calling all_inventory to load vars for managed-node3 41175 1727204635.95165: Calling groups_inventory to load vars for managed-node3 41175 1727204635.95168: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204635.95173: Calling all_plugins_play to load vars for managed-node3 41175 1727204635.95177: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204635.95180: Calling groups_plugins_play to load vars for managed-node3 41175 1727204635.95361: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204635.95616: done with get_vars() 41175 1727204635.95625: done getting variables 41175 1727204635.95677: in VariableManager get_vars() 41175 1727204635.95686: Calling all_inventory to load vars for managed-node3 41175 1727204635.95688: Calling groups_inventory to load vars for managed-node3 41175 1727204635.95693: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204635.95698: Calling all_plugins_play to load vars for managed-node3 41175 1727204635.95701: Calling groups_plugins_inventory to load vars for 
managed-node3 41175 1727204635.95704: Calling groups_plugins_play to load vars for managed-node3 41175 1727204635.95906: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204635.96204: done with get_vars() 41175 1727204635.96218: done queuing things up, now waiting for results queue to drain 41175 1727204635.96220: results queue empty 41175 1727204635.96221: checking for any_errors_fatal 41175 1727204635.96222: done checking for any_errors_fatal 41175 1727204635.96223: checking for max_fail_percentage 41175 1727204635.96224: done checking for max_fail_percentage 41175 1727204635.96225: checking to see if all hosts have failed and the running result is not ok 41175 1727204635.96226: done checking to see if all hosts have failed 41175 1727204635.96227: getting the remaining hosts for this loop 41175 1727204635.96228: done getting the remaining hosts for this loop 41175 1727204635.96231: getting the next task for host managed-node3 41175 1727204635.96234: done getting next task for host managed-node3 41175 1727204635.96235: ^ task is: None 41175 1727204635.96237: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204635.96239: done queuing things up, now waiting for results queue to drain 41175 1727204635.96240: results queue empty 41175 1727204635.96241: checking for any_errors_fatal 41175 1727204635.96241: done checking for any_errors_fatal 41175 1727204635.96242: checking for max_fail_percentage 41175 1727204635.96243: done checking for max_fail_percentage 41175 1727204635.96244: checking to see if all hosts have failed and the running result is not ok 41175 1727204635.96245: done checking to see if all hosts have failed 41175 1727204635.96247: getting the next task for host managed-node3 41175 1727204635.96250: done getting next task for host managed-node3 41175 1727204635.96251: ^ task is: None 41175 1727204635.96253: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204635.96299: in VariableManager get_vars() 41175 1727204635.96327: done with get_vars() 41175 1727204635.96336: in VariableManager get_vars() 41175 1727204635.96357: done with get_vars() 41175 1727204635.96364: variable 'omit' from source: magic vars 41175 1727204635.96404: in VariableManager get_vars() 41175 1727204635.96426: done with get_vars() 41175 1727204635.96454: variable 'omit' from source: magic vars PLAY [Play for testing route table] ******************************************** 41175 1727204635.96830: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 41175 1727204635.96854: getting the remaining hosts for this loop 41175 1727204635.96855: done getting the remaining hosts for this loop 41175 1727204635.96857: getting the next task for host managed-node3 41175 1727204635.96859: done getting next task for host managed-node3 41175 1727204635.96861: ^ task is: TASK: Gathering Facts 41175 1727204635.96862: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204635.96863: getting variables 41175 1727204635.96864: in VariableManager get_vars() 41175 1727204635.96875: Calling all_inventory to load vars for managed-node3 41175 1727204635.96876: Calling groups_inventory to load vars for managed-node3 41175 1727204635.96880: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204635.96885: Calling all_plugins_play to load vars for managed-node3 41175 1727204635.96901: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204635.96903: Calling groups_plugins_play to load vars for managed-node3 41175 1727204635.97055: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204635.97337: done with get_vars() 41175 1727204635.97347: done getting variables 41175 1727204635.97395: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:3 Tuesday 24 September 2024 15:03:55 -0400 (0:00:00.064) 0:00:03.113 ***** 41175 1727204635.97421: entering _queue_task() for managed-node3/gather_facts 41175 1727204635.97743: worker is 1 (out of 1 available) 41175 1727204635.97755: exiting _queue_task() for managed-node3/gather_facts 41175 1727204635.97764: done queuing things up, now waiting for results queue to drain 41175 1727204635.97766: waiting for pending results... 
41175 1727204635.98153: running TaskExecutor() for managed-node3/TASK: Gathering Facts 41175 1727204635.98158: in run() - task 12b410aa-8751-f070-39c4-000000000151 41175 1727204635.98162: variable 'ansible_search_path' from source: unknown 41175 1727204635.98231: calling self._execute() 41175 1727204635.98324: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204635.98331: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204635.98340: variable 'omit' from source: magic vars 41175 1727204635.98720: variable 'ansible_distribution_major_version' from source: facts 41175 1727204635.98895: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204635.98899: variable 'omit' from source: magic vars 41175 1727204635.98901: variable 'omit' from source: magic vars 41175 1727204635.98996: variable 'omit' from source: magic vars 41175 1727204635.98999: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41175 1727204635.99394: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41175 1727204635.99398: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41175 1727204635.99400: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204635.99402: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204635.99404: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41175 1727204635.99406: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204635.99408: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204635.99492: Set connection var ansible_shell_executable to /bin/sh 41175 1727204635.99502: Set 
connection var ansible_shell_type to sh 41175 1727204635.99516: Set connection var ansible_pipelining to False 41175 1727204635.99540: Set connection var ansible_timeout to 10 41175 1727204635.99553: Set connection var ansible_connection to ssh 41175 1727204635.99564: Set connection var ansible_module_compression to ZIP_DEFLATED 41175 1727204635.99603: variable 'ansible_shell_executable' from source: unknown 41175 1727204635.99612: variable 'ansible_connection' from source: unknown 41175 1727204635.99623: variable 'ansible_module_compression' from source: unknown 41175 1727204635.99638: variable 'ansible_shell_type' from source: unknown 41175 1727204635.99647: variable 'ansible_shell_executable' from source: unknown 41175 1727204635.99656: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204635.99665: variable 'ansible_pipelining' from source: unknown 41175 1727204635.99673: variable 'ansible_timeout' from source: unknown 41175 1727204635.99682: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204635.99929: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41175 1727204635.99951: variable 'omit' from source: magic vars 41175 1727204635.99968: starting attempt loop 41175 1727204635.99977: running the handler 41175 1727204636.00003: variable 'ansible_facts' from source: unknown 41175 1727204636.00043: _low_level_execute_command(): starting 41175 1727204636.00057: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41175 1727204636.01712: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204636.01802: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204636.01836: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204636.01931: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 3 <<< 41175 1727204636.04341: stdout chunk (state=3): >>>/root <<< 41175 1727204636.04597: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204636.04601: stdout chunk (state=3): >>><<< 41175 1727204636.04604: stderr chunk (state=3): >>><<< 41175 1727204636.04697: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 3 debug2: Received exit status from master 0 41175 1727204636.04701: _low_level_execute_command(): starting 41175 1727204636.04704: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204636.0466928-41351-201909915601212 `" && echo ansible-tmp-1727204636.0466928-41351-201909915601212="` echo /root/.ansible/tmp/ansible-tmp-1727204636.0466928-41351-201909915601212 `" ) && sleep 0' 41175 1727204636.05901: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204636.05906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204636.05909: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 
10.31.10.90 debug2: match found <<< 41175 1727204636.05918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204636.06014: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204636.06157: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204636.06182: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 41175 1727204636.09059: stdout chunk (state=3): >>>ansible-tmp-1727204636.0466928-41351-201909915601212=/root/.ansible/tmp/ansible-tmp-1727204636.0466928-41351-201909915601212 <<< 41175 1727204636.09598: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204636.09601: stdout chunk (state=3): >>><<< 41175 1727204636.09604: stderr chunk (state=3): >>><<< 41175 1727204636.09606: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204636.0466928-41351-201909915601212=/root/.ansible/tmp/ansible-tmp-1727204636.0466928-41351-201909915601212 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 41175 1727204636.09609: variable 'ansible_module_compression' from source: unknown 41175 1727204636.09697: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-41175qa2aqh90/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 41175 1727204636.09748: variable 'ansible_facts' from source: unknown 41175 1727204636.10497: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204636.0466928-41351-201909915601212/AnsiballZ_setup.py 41175 1727204636.10852: Sending initial data 41175 1727204636.10941: Sent initial data (154 bytes) 41175 1727204636.12310: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204636.12516: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204636.12597: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204636.12648: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204636.12735: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204636.14415: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41175 1727204636.14458: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41175 1727204636.14515: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41175qa2aqh90/tmp0__p_x31 /root/.ansible/tmp/ansible-tmp-1727204636.0466928-41351-201909915601212/AnsiballZ_setup.py <<< 41175 1727204636.14521: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204636.0466928-41351-201909915601212/AnsiballZ_setup.py" <<< 41175 1727204636.14556: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-41175qa2aqh90/tmp0__p_x31" to remote "/root/.ansible/tmp/ansible-tmp-1727204636.0466928-41351-201909915601212/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204636.0466928-41351-201909915601212/AnsiballZ_setup.py" <<< 41175 1727204636.17146: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204636.17196: stderr chunk (state=3): >>><<< 41175 1727204636.17209: stdout chunk (state=3): >>><<< 41175 1727204636.17267: done transferring module to remote 41175 1727204636.17285: _low_level_execute_command(): starting 41175 1727204636.17298: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204636.0466928-41351-201909915601212/ /root/.ansible/tmp/ansible-tmp-1727204636.0466928-41351-201909915601212/AnsiballZ_setup.py && sleep 0' 41175 1727204636.18050: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 1727204636.18179: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204636.18206: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204636.18296: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204636.20305: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204636.20308: stdout chunk (state=3): >>><<< 41175 1727204636.20311: stderr chunk (state=3): >>><<< 41175 1727204636.20402: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204636.20413: _low_level_execute_command(): starting 41175 1727204636.20416: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204636.0466928-41351-201909915601212/AnsiballZ_setup.py && sleep 0' 41175 1727204636.21061: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 1727204636.21078: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204636.21095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204636.21116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204636.21169: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204636.21182: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204636.21282: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204636.21301: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204636.21325: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204636.21343: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 
1727204636.21424: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204637.13086: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAI5YZQ7OH6eqgmanrwxkUl16pMvE2q26X32NofYRKBzF04m84VIsiCBP80rN+sGEKnRhTwlxJwcSfAyscmxkynk8ozeR0SaMEECkbOjee1DqGR1yz8VSKEIk2gZ+ImYscF6c32jGvz1w/gz9baswEs+v92Ljqv3+V3s8foVkwWM1AAAAFQDApo03iAyJzp9y7AillVl9LpN8rwAAAIBNHNvfLLH/rvWMdavYWGiljarx5Z8cDKFv4QiliuY2AenrQ5mjBN3ZJZuDpmwC9vuoPM+TWxp9pbrnVJy4VM6iS8c/Lr9I982fUD4neMvJEywdnYtsRhezGMCk57/Npw91h6EKhcAYiaFF53jl540WIjTvu2bEA8Hgb11YGH+isAAAAIAkremps+61DEFeDWQjRHbf8fZzhmpUYduU+sHRW5usa/1cOOeeN/8XBHfMST6TPedAY/6t7Oxda9D2mq6mo2Rl9arSQWcBypqwvzRiz0LGnRnElGtXKJALy6vYKG7xi+29ZmqlBvD14cB7/wSZqZP9MkRj3+QzQJLvNnuGRyLguA==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDBj8PEtqglWtlJ3r3hgP2TELjSd8JOOpjIitLlWjKdUao5ePB6PWTf9MZV0rLZr0re7hAS1EWeexARYQakyETmyOoPmRCaD5vvrfN3AJJ6I+O2EhApLpYrEORJbTfrme6AoCGmxQG8tR7j3YpVOvePZ65ka7FDUWnNLI0DWpyDURAKmvOxtiOcYazpmB7GJ/5ycpEAV7KGp7tEQ9MNIAbSaYTBXVBNa5V2HyEmcabs+/Qy/jp8OWy+Tl3uCUV0SmFplVCKib9Kp3eEMZd5udXsYnmUYtLNMJQkQOzTdol5AozustkdBnasVn/RSnQpWQMBrrUQMxchNOb8FDAuH6AONEVJl9mHY6mk3zfkkyPZE6sIrMIj0B48xTWzMIjC+N9SN7DRRUWzjYIqkL5fsYu0fkkGuZeNvyJRlv8h7oFWA7YtvNHdNYf41mkXryERg8V3zI0aZcmQul6XTOxywwd4b5sudMIng09hfyPOKtnYi6DIN2h5FxOWlvBEbLlcd2U=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPUvqdp1GSRMDwSqfOZO1hLGpDfzy51B9cIhTK2AWy7qlUXPaSlJ0jc31uj+CW3SnUW36VSKRHdj9R9hJev9Zic=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFL7RdA+aCgUcBhcJBLwti3mnwduhYXxSw8RlI3Cvebm", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_fibre_channel_wwn": [], "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec272ed147e29e35f2e68cd6465c5ec1", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_local": {}, "ansible_fips": false, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", 
"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core":<<< 41175 1727204637.13135: stdout chunk (state=3): >>> 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2830, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 887, "free": 2830}, "nocache": {"free": 3460, "used": 257}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec272ed1-47e2-9e35-f2e6-8cd6465c5ec1", "ansible_product_uuid": "ec272ed1-47e2-9e35-f2e6-8cd6465c5ec1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": 
"2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 1141, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251148750848, "block_size": 4096, "block_total": 64479564, "block_available": 61315613, "block_used": 3163951, "inode_total": 16384000, "inode_available": 16302069, "inode_used": 81931, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_pkg_mgr": "dnf", "ansible_iscsi_iqn": "", "ansible_loadavg": {"1m": 0.9990234375, "5m": 0.84326171875, "15m": 0.5166015625}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_hostnqn": "", "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:5e:c8:16:36:1d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.90", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::37d3:4e93:30d:de94", "prefix": "64", "scope": "link"}], "features": 
{"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", 
"tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", 
"tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", <<< 41175 1727204637.13170: stdout chunk (state=3): >>>"rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.90", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:5e:c8:16:36:1d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.90"], "ansible_all_ipv6_addresses": ["fe80::37d3:4e93:30d:de94"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::37d3:4e93:30d:de94"]}, "ansible_lsb": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "03", "second": "57", "epoch": "1727204637", "epoch_int": "1727204637", "date": "2024-09-24", "time": "15:03:57", "iso8601_micro": "2024-09-24T19:03:57.122889Z", "iso8601": "2024-09-24T19:03:57Z", "iso8601_basic": "20240924T150357122889", "iso8601_basic_short": "20240924T150357", "tz": "EDT", "tz_dst": "EDT", 
"tz_offset": "-0400"}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 50414 10.31.10.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 50414 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_is_chroot": false, "ansible_service_mgr": "systemd", "ansible_apparmor": {"status": "disabled"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 41175 1727204637.16309: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
<<< 41175 1727204637.16335: stderr chunk (state=3): >>><<< 41175 1727204637.16338: stdout chunk (state=3): >>><<< 41175 1727204637.16377: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAI5YZQ7OH6eqgmanrwxkUl16pMvE2q26X32NofYRKBzF04m84VIsiCBP80rN+sGEKnRhTwlxJwcSfAyscmxkynk8ozeR0SaMEECkbOjee1DqGR1yz8VSKEIk2gZ+ImYscF6c32jGvz1w/gz9baswEs+v92Ljqv3+V3s8foVkwWM1AAAAFQDApo03iAyJzp9y7AillVl9LpN8rwAAAIBNHNvfLLH/rvWMdavYWGiljarx5Z8cDKFv4QiliuY2AenrQ5mjBN3ZJZuDpmwC9vuoPM+TWxp9pbrnVJy4VM6iS8c/Lr9I982fUD4neMvJEywdnYtsRhezGMCk57/Npw91h6EKhcAYiaFF53jl540WIjTvu2bEA8Hgb11YGH+isAAAAIAkremps+61DEFeDWQjRHbf8fZzhmpUYduU+sHRW5usa/1cOOeeN/8XBHfMST6TPedAY/6t7Oxda9D2mq6mo2Rl9arSQWcBypqwvzRiz0LGnRnElGtXKJALy6vYKG7xi+29ZmqlBvD14cB7/wSZqZP9MkRj3+QzQJLvNnuGRyLguA==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDBj8PEtqglWtlJ3r3hgP2TELjSd8JOOpjIitLlWjKdUao5ePB6PWTf9MZV0rLZr0re7hAS1EWeexARYQakyETmyOoPmRCaD5vvrfN3AJJ6I+O2EhApLpYrEORJbTfrme6AoCGmxQG8tR7j3YpVOvePZ65ka7FDUWnNLI0DWpyDURAKmvOxtiOcYazpmB7GJ/5ycpEAV7KGp7tEQ9MNIAbSaYTBXVBNa5V2HyEmcabs+/Qy/jp8OWy+Tl3uCUV0SmFplVCKib9Kp3eEMZd5udXsYnmUYtLNMJQkQOzTdol5AozustkdBnasVn/RSnQpWQMBrrUQMxchNOb8FDAuH6AONEVJl9mHY6mk3zfkkyPZE6sIrMIj0B48xTWzMIjC+N9SN7DRRUWzjYIqkL5fsYu0fkkGuZeNvyJRlv8h7oFWA7YtvNHdNYf41mkXryERg8V3zI0aZcmQul6XTOxywwd4b5sudMIng09hfyPOKtnYi6DIN2h5FxOWlvBEbLlcd2U=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPUvqdp1GSRMDwSqfOZO1hLGpDfzy51B9cIhTK2AWy7qlUXPaSlJ0jc31uj+CW3SnUW36VSKRHdj9R9hJev9Zic=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFL7RdA+aCgUcBhcJBLwti3mnwduhYXxSw8RlI3Cvebm", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_fibre_channel_wwn": [], "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec272ed147e29e35f2e68cd6465c5ec1", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_local": {}, "ansible_fips": false, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", 
"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2830, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 887, "free": 2830}, "nocache": {"free": 3460, "used": 257}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec272ed1-47e2-9e35-f2e6-8cd6465c5ec1", "ansible_product_uuid": "ec272ed1-47e2-9e35-f2e6-8cd6465c5ec1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 
MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 1141, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251148750848, "block_size": 4096, "block_total": 64479564, "block_available": 61315613, "block_used": 3163951, "inode_total": 16384000, "inode_available": 16302069, "inode_used": 81931, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_pkg_mgr": "dnf", "ansible_iscsi_iqn": "", "ansible_loadavg": {"1m": 0.9990234375, "5m": 0.84326171875, "15m": 0.5166015625}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_hostnqn": "", "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:5e:c8:16:36:1d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.90", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::37d3:4e93:30d:de94", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", 
"tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", 
"macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off 
[fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.90", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:5e:c8:16:36:1d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.90"], "ansible_all_ipv6_addresses": ["fe80::37d3:4e93:30d:de94"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::37d3:4e93:30d:de94"]}, "ansible_lsb": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "03", "second": "57", "epoch": "1727204637", "epoch_int": "1727204637", "date": "2024-09-24", "time": "15:03:57", "iso8601_micro": "2024-09-24T19:03:57.122889Z", "iso8601": "2024-09-24T19:03:57Z", "iso8601_basic": "20240924T150357122889", "iso8601_basic_short": "20240924T150357", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", 
"XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 50414 10.31.10.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 50414 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_is_chroot": false, "ansible_service_mgr": "systemd", "ansible_apparmor": {"status": "disabled"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 41175 1727204637.17863: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204636.0466928-41351-201909915601212/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41175 1727204637.17870: _low_level_execute_command(): starting 41175 1727204637.17882: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204636.0466928-41351-201909915601212/ > /dev/null 2>&1 && sleep 0' 41175 1727204637.19225: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 1727204637.19241: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204637.19253: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204637.19272: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204637.19371: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 41175 1727204637.19641: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204637.19679: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204637.22614: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204637.22618: stdout chunk (state=3): >>><<< 41175 1727204637.22621: stderr chunk (state=3): >>><<< 41175 1727204637.22648: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204637.22673: handler run complete 41175 1727204637.22926: variable 'ansible_facts' from source: unknown 41175 1727204637.23106: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204637.23641: variable 'ansible_facts' from source: unknown 41175 1727204637.23963: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204637.24203: attempt loop complete, returning result 41175 1727204637.24214: _execute() done 41175 1727204637.24222: dumping result to json 41175 1727204637.24266: done dumping result, returning 41175 1727204637.24288: done running TaskExecutor() for managed-node3/TASK: Gathering Facts [12b410aa-8751-f070-39c4-000000000151] 41175 1727204637.24303: sending task result for task 12b410aa-8751-f070-39c4-000000000151 ok: [managed-node3] 41175 1727204637.25564: no more pending results, returning what we have 41175 1727204637.25567: results queue empty 41175 1727204637.25568: checking for any_errors_fatal 41175 1727204637.25570: done checking for any_errors_fatal 41175 1727204637.25571: checking for max_fail_percentage 41175 1727204637.25573: done checking for max_fail_percentage 41175 1727204637.25574: checking to see if all hosts have failed and the running result is not ok 41175 1727204637.25575: done checking to see if all hosts have failed 41175 1727204637.25576: getting the remaining hosts for this loop 41175 1727204637.25578: done getting the remaining hosts for this loop 41175 1727204637.25582: getting the next task for host managed-node3 41175 1727204637.25588: done getting next task for host managed-node3 
41175 1727204637.25598: ^ task is: TASK: meta (flush_handlers) 41175 1727204637.25600: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41175 1727204637.25604: getting variables 41175 1727204637.25606: in VariableManager get_vars() 41175 1727204637.25642: Calling all_inventory to load vars for managed-node3 41175 1727204637.25646: Calling groups_inventory to load vars for managed-node3 41175 1727204637.25649: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204637.25657: done sending task result for task 12b410aa-8751-f070-39c4-000000000151 41175 1727204637.25660: WORKER PROCESS EXITING 41175 1727204637.25672: Calling all_plugins_play to load vars for managed-node3 41175 1727204637.25675: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204637.25679: Calling groups_plugins_play to load vars for managed-node3 41175 1727204637.25951: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204637.26311: done with get_vars() 41175 1727204637.26325: done getting variables 41175 1727204637.26414: in VariableManager get_vars() 41175 1727204637.26433: Calling all_inventory to load vars for managed-node3 41175 1727204637.26436: Calling groups_inventory to load vars for managed-node3 41175 1727204637.26439: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204637.26445: Calling all_plugins_play to load vars for managed-node3 41175 1727204637.26448: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204637.26452: Calling groups_plugins_play to load vars for managed-node3 41175 1727204637.26689: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 41175 1727204637.27032: done with get_vars() 41175 1727204637.27050: done queuing things up, now waiting for results queue to drain 41175 1727204637.27052: results queue empty 41175 1727204637.27053: checking for any_errors_fatal 41175 1727204637.27058: done checking for any_errors_fatal 41175 1727204637.27059: checking for max_fail_percentage 41175 1727204637.27061: done checking for max_fail_percentage 41175 1727204637.27062: checking to see if all hosts have failed and the running result is not ok 41175 1727204637.27068: done checking to see if all hosts have failed 41175 1727204637.27069: getting the remaining hosts for this loop 41175 1727204637.27070: done getting the remaining hosts for this loop 41175 1727204637.27073: getting the next task for host managed-node3 41175 1727204637.27078: done getting next task for host managed-node3 41175 1727204637.27080: ^ task is: TASK: Set type={{ type }} and interface={{ interface }} 41175 1727204637.27082: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204637.27085: getting variables 41175 1727204637.27086: in VariableManager get_vars() 41175 1727204637.27106: Calling all_inventory to load vars for managed-node3 41175 1727204637.27109: Calling groups_inventory to load vars for managed-node3 41175 1727204637.27112: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204637.27123: Calling all_plugins_play to load vars for managed-node3 41175 1727204637.27131: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204637.27135: Calling groups_plugins_play to load vars for managed-node3 41175 1727204637.27369: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204637.27711: done with get_vars() 41175 1727204637.27723: done getting variables 41175 1727204637.27785: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 41175 1727204637.27962: variable 'type' from source: play vars 41175 1727204637.27968: variable 'interface' from source: play vars TASK [Set type=veth and interface=ethtest0] ************************************ task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:11 Tuesday 24 September 2024 15:03:57 -0400 (0:00:01.305) 0:00:04.419 ***** 41175 1727204637.28022: entering _queue_task() for managed-node3/set_fact 41175 1727204637.28475: worker is 1 (out of 1 available) 41175 1727204637.28491: exiting _queue_task() for managed-node3/set_fact 41175 1727204637.28503: done queuing things up, now waiting for results queue to drain 41175 1727204637.28505: waiting for pending results... 
41175 1727204637.28702: running TaskExecutor() for managed-node3/TASK: Set type=veth and interface=ethtest0 41175 1727204637.28827: in run() - task 12b410aa-8751-f070-39c4-00000000000b 41175 1727204637.28853: variable 'ansible_search_path' from source: unknown 41175 1727204637.28907: calling self._execute() 41175 1727204637.29149: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204637.29170: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204637.29186: variable 'omit' from source: magic vars 41175 1727204637.29641: variable 'ansible_distribution_major_version' from source: facts 41175 1727204637.29660: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204637.29673: variable 'omit' from source: magic vars 41175 1727204637.29701: variable 'omit' from source: magic vars 41175 1727204637.29754: variable 'type' from source: play vars 41175 1727204637.29850: variable 'type' from source: play vars 41175 1727204637.29895: variable 'interface' from source: play vars 41175 1727204637.29944: variable 'interface' from source: play vars 41175 1727204637.29972: variable 'omit' from source: magic vars 41175 1727204637.30023: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41175 1727204637.30087: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41175 1727204637.30157: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41175 1727204637.30161: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204637.30179: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204637.30225: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41175 
1727204637.30236: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204637.30245: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204637.30485: Set connection var ansible_shell_executable to /bin/sh 41175 1727204637.30491: Set connection var ansible_shell_type to sh 41175 1727204637.30493: Set connection var ansible_pipelining to False 41175 1727204637.30496: Set connection var ansible_timeout to 10 41175 1727204637.30498: Set connection var ansible_connection to ssh 41175 1727204637.30501: Set connection var ansible_module_compression to ZIP_DEFLATED 41175 1727204637.30504: variable 'ansible_shell_executable' from source: unknown 41175 1727204637.30506: variable 'ansible_connection' from source: unknown 41175 1727204637.30517: variable 'ansible_module_compression' from source: unknown 41175 1727204637.30526: variable 'ansible_shell_type' from source: unknown 41175 1727204637.30534: variable 'ansible_shell_executable' from source: unknown 41175 1727204637.30542: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204637.30551: variable 'ansible_pipelining' from source: unknown 41175 1727204637.30597: variable 'ansible_timeout' from source: unknown 41175 1727204637.30600: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204637.30775: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41175 1727204637.30796: variable 'omit' from source: magic vars 41175 1727204637.30817: starting attempt loop 41175 1727204637.30839: running the handler 41175 1727204637.30854: handler run complete 41175 1727204637.30915: attempt loop complete, returning result 41175 1727204637.30918: _execute() done 41175 
1727204637.30923: dumping result to json 41175 1727204637.30925: done dumping result, returning 41175 1727204637.30928: done running TaskExecutor() for managed-node3/TASK: Set type=veth and interface=ethtest0 [12b410aa-8751-f070-39c4-00000000000b] 41175 1727204637.30930: sending task result for task 12b410aa-8751-f070-39c4-00000000000b 41175 1727204637.31108: done sending task result for task 12b410aa-8751-f070-39c4-00000000000b 41175 1727204637.31112: WORKER PROCESS EXITING ok: [managed-node3] => { "ansible_facts": { "interface": "ethtest0", "type": "veth" }, "changed": false } 41175 1727204637.31413: no more pending results, returning what we have 41175 1727204637.31416: results queue empty 41175 1727204637.31417: checking for any_errors_fatal 41175 1727204637.31419: done checking for any_errors_fatal 41175 1727204637.31420: checking for max_fail_percentage 41175 1727204637.31422: done checking for max_fail_percentage 41175 1727204637.31423: checking to see if all hosts have failed and the running result is not ok 41175 1727204637.31425: done checking to see if all hosts have failed 41175 1727204637.31426: getting the remaining hosts for this loop 41175 1727204637.31428: done getting the remaining hosts for this loop 41175 1727204637.31433: getting the next task for host managed-node3 41175 1727204637.31440: done getting next task for host managed-node3 41175 1727204637.31443: ^ task is: TASK: Include the task 'show_interfaces.yml' 41175 1727204637.31445: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204637.31448: getting variables 41175 1727204637.31450: in VariableManager get_vars() 41175 1727204637.31608: Calling all_inventory to load vars for managed-node3 41175 1727204637.31612: Calling groups_inventory to load vars for managed-node3 41175 1727204637.31615: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204637.31627: Calling all_plugins_play to load vars for managed-node3 41175 1727204637.31631: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204637.31635: Calling groups_plugins_play to load vars for managed-node3 41175 1727204637.31899: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204637.32227: done with get_vars() 41175 1727204637.32240: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:15 Tuesday 24 September 2024 15:03:57 -0400 (0:00:00.043) 0:00:04.462 ***** 41175 1727204637.32353: entering _queue_task() for managed-node3/include_tasks 41175 1727204637.32657: worker is 1 (out of 1 available) 41175 1727204637.32671: exiting _queue_task() for managed-node3/include_tasks 41175 1727204637.32685: done queuing things up, now waiting for results queue to drain 41175 1727204637.32687: waiting for pending results... 
41175 1727204637.33107: running TaskExecutor() for managed-node3/TASK: Include the task 'show_interfaces.yml' 41175 1727204637.33114: in run() - task 12b410aa-8751-f070-39c4-00000000000c 41175 1727204637.33127: variable 'ansible_search_path' from source: unknown 41175 1727204637.33183: calling self._execute() 41175 1727204637.33286: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204637.33302: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204637.33316: variable 'omit' from source: magic vars 41175 1727204637.33795: variable 'ansible_distribution_major_version' from source: facts 41175 1727204637.33799: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204637.33801: _execute() done 41175 1727204637.33804: dumping result to json 41175 1727204637.33807: done dumping result, returning 41175 1727204637.33810: done running TaskExecutor() for managed-node3/TASK: Include the task 'show_interfaces.yml' [12b410aa-8751-f070-39c4-00000000000c] 41175 1727204637.33818: sending task result for task 12b410aa-8751-f070-39c4-00000000000c 41175 1727204637.34023: done sending task result for task 12b410aa-8751-f070-39c4-00000000000c 41175 1727204637.34027: WORKER PROCESS EXITING 41175 1727204637.34059: no more pending results, returning what we have 41175 1727204637.34065: in VariableManager get_vars() 41175 1727204637.34123: Calling all_inventory to load vars for managed-node3 41175 1727204637.34127: Calling groups_inventory to load vars for managed-node3 41175 1727204637.34130: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204637.34152: Calling all_plugins_play to load vars for managed-node3 41175 1727204637.34156: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204637.34160: Calling groups_plugins_play to load vars for managed-node3 41175 1727204637.34586: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204637.34924: done with get_vars() 41175 1727204637.34933: variable 'ansible_search_path' from source: unknown 41175 1727204637.34947: we have included files to process 41175 1727204637.34953: generating all_blocks data 41175 1727204637.34956: done generating all_blocks data 41175 1727204637.34957: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 41175 1727204637.34958: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 41175 1727204637.34961: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 41175 1727204637.35153: in VariableManager get_vars() 41175 1727204637.35184: done with get_vars() 41175 1727204637.35327: done processing included file 41175 1727204637.35329: iterating over new_blocks loaded from include file 41175 1727204637.35331: in VariableManager get_vars() 41175 1727204637.35352: done with get_vars() 41175 1727204637.35354: filtering new block on tags 41175 1727204637.35375: done filtering new block on tags 41175 1727204637.35377: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node3 41175 1727204637.35382: extending task lists for all hosts with included blocks 41175 1727204637.38413: done extending task lists 41175 1727204637.38415: done processing included files 41175 1727204637.38416: results queue empty 41175 1727204637.38417: checking for any_errors_fatal 41175 1727204637.38420: done checking for any_errors_fatal 41175 1727204637.38421: checking for max_fail_percentage 41175 1727204637.38423: done checking for 
max_fail_percentage 41175 1727204637.38423: checking to see if all hosts have failed and the running result is not ok 41175 1727204637.38424: done checking to see if all hosts have failed 41175 1727204637.38425: getting the remaining hosts for this loop 41175 1727204637.38426: done getting the remaining hosts for this loop 41175 1727204637.38429: getting the next task for host managed-node3 41175 1727204637.38433: done getting next task for host managed-node3 41175 1727204637.38435: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 41175 1727204637.38438: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204637.38440: getting variables 41175 1727204637.38441: in VariableManager get_vars() 41175 1727204637.38456: Calling all_inventory to load vars for managed-node3 41175 1727204637.38458: Calling groups_inventory to load vars for managed-node3 41175 1727204637.38460: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204637.38466: Calling all_plugins_play to load vars for managed-node3 41175 1727204637.38468: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204637.38471: Calling groups_plugins_play to load vars for managed-node3 41175 1727204637.38681: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204637.39023: done with get_vars() 41175 1727204637.39038: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Tuesday 24 September 2024 15:03:57 -0400 (0:00:00.067) 0:00:04.530 ***** 41175 1727204637.39123: entering _queue_task() for managed-node3/include_tasks 41175 1727204637.39435: worker is 1 (out of 1 available) 41175 1727204637.39449: exiting _queue_task() for managed-node3/include_tasks 41175 1727204637.39704: done queuing things up, now waiting for results queue to drain 41175 1727204637.39707: waiting for pending results... 
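[editor's note: the task header above points at `show_interfaces.yml:3` in the fedora.linux_system_roles collection. The following is a hedged reconstruction of what that line likely looks like, inferred only from the task name and the included file path recorded in this log, not copied from the actual collection file:]

```yaml
# Hypothetical reconstruction -- inferred from the log, not the real file:
- name: Include the task 'get_current_interfaces.yml'
  include_tasks: tasks/get_current_interfaces.yml
```

Because this is an `include_tasks` (dynamic include), the log shows the include itself running as a task first ("sending task result", "extending task lists"), and only afterwards does the strategy discover the new `Gather current interface info` task inside the included file.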
41175 1727204637.39825: running TaskExecutor() for managed-node3/TASK: Include the task 'get_current_interfaces.yml' 41175 1727204637.39854: in run() - task 12b410aa-8751-f070-39c4-000000000169 41175 1727204637.39874: variable 'ansible_search_path' from source: unknown 41175 1727204637.39882: variable 'ansible_search_path' from source: unknown 41175 1727204637.39933: calling self._execute() 41175 1727204637.40136: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204637.40139: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204637.40142: variable 'omit' from source: magic vars 41175 1727204637.40869: variable 'ansible_distribution_major_version' from source: facts 41175 1727204637.40970: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204637.40974: _execute() done 41175 1727204637.40977: dumping result to json 41175 1727204637.40979: done dumping result, returning 41175 1727204637.40982: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_current_interfaces.yml' [12b410aa-8751-f070-39c4-000000000169] 41175 1727204637.40985: sending task result for task 12b410aa-8751-f070-39c4-000000000169 41175 1727204637.41060: done sending task result for task 12b410aa-8751-f070-39c4-000000000169 41175 1727204637.41063: WORKER PROCESS EXITING 41175 1727204637.41103: no more pending results, returning what we have 41175 1727204637.41114: in VariableManager get_vars() 41175 1727204637.41162: Calling all_inventory to load vars for managed-node3 41175 1727204637.41166: Calling groups_inventory to load vars for managed-node3 41175 1727204637.41169: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204637.41186: Calling all_plugins_play to load vars for managed-node3 41175 1727204637.41192: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204637.41196: Calling groups_plugins_play to load vars for managed-node3 41175 
1727204637.41931: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204637.42264: done with get_vars() 41175 1727204637.42272: variable 'ansible_search_path' from source: unknown 41175 1727204637.42273: variable 'ansible_search_path' from source: unknown 41175 1727204637.42318: we have included files to process 41175 1727204637.42319: generating all_blocks data 41175 1727204637.42321: done generating all_blocks data 41175 1727204637.42322: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 41175 1727204637.42323: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 41175 1727204637.42326: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 41175 1727204637.42717: done processing included file 41175 1727204637.42720: iterating over new_blocks loaded from include file 41175 1727204637.42722: in VariableManager get_vars() 41175 1727204637.42743: done with get_vars() 41175 1727204637.42745: filtering new block on tags 41175 1727204637.42767: done filtering new block on tags 41175 1727204637.42770: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed-node3 41175 1727204637.42775: extending task lists for all hosts with included blocks 41175 1727204637.42921: done extending task lists 41175 1727204637.42922: done processing included files 41175 1727204637.42923: results queue empty 41175 1727204637.42924: checking for any_errors_fatal 41175 1727204637.42928: done checking for any_errors_fatal 41175 1727204637.42929: checking for max_fail_percentage 41175 1727204637.42930: done 
checking for max_fail_percentage 41175 1727204637.42931: checking to see if all hosts have failed and the running result is not ok 41175 1727204637.42932: done checking to see if all hosts have failed 41175 1727204637.42933: getting the remaining hosts for this loop 41175 1727204637.42934: done getting the remaining hosts for this loop 41175 1727204637.42937: getting the next task for host managed-node3 41175 1727204637.42942: done getting next task for host managed-node3 41175 1727204637.42944: ^ task is: TASK: Gather current interface info 41175 1727204637.42947: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204637.42950: getting variables 41175 1727204637.42951: in VariableManager get_vars() 41175 1727204637.42966: Calling all_inventory to load vars for managed-node3 41175 1727204637.42968: Calling groups_inventory to load vars for managed-node3 41175 1727204637.42971: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204637.42976: Calling all_plugins_play to load vars for managed-node3 41175 1727204637.42980: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204637.42983: Calling groups_plugins_play to load vars for managed-node3 41175 1727204637.43243: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204637.43580: done with get_vars() 41175 1727204637.43593: done getting variables 41175 1727204637.43638: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Tuesday 24 September 2024 15:03:57 -0400 (0:00:00.045) 0:00:04.575 ***** 41175 1727204637.43677: entering _queue_task() for managed-node3/command 41175 1727204637.43956: worker is 1 (out of 1 available) 41175 1727204637.43969: exiting _queue_task() for managed-node3/command 41175 1727204637.44098: done queuing things up, now waiting for results queue to drain 41175 1727204637.44100: waiting for pending results... 
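[editor's note: the `Gather current interface info` task queued above boils down to a single remote command. This is a sketch reconstructed from the `module_args` recorded later in the log (`chdir: /sys/class/net`, `_raw_params: ls -1`), not the command module itself: each line of stdout is one interface name, plus control files such as `bonding_masters` when the bonding module is loaded.]

```shell
# Sketch of the remote command behind 'Gather current interface info'
# (reconstructed from module_args in the log; guard added so the sketch
# is a no-op on systems without sysfs):
if [ -d /sys/class/net ]; then
    ( cd /sys/class/net && ls -1 )   # one entry per line, e.g. eth0, lo
fi
```

The matching result appears further down in the log as `"stdout": "bonding_masters\neth0\nlo"`.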
41175 1727204637.44332: running TaskExecutor() for managed-node3/TASK: Gather current interface info 41175 1727204637.44428: in run() - task 12b410aa-8751-f070-39c4-00000000024e 41175 1727204637.44432: variable 'ansible_search_path' from source: unknown 41175 1727204637.44434: variable 'ansible_search_path' from source: unknown 41175 1727204637.44478: calling self._execute() 41175 1727204637.44648: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204637.44652: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204637.44655: variable 'omit' from source: magic vars 41175 1727204637.45133: variable 'ansible_distribution_major_version' from source: facts 41175 1727204637.45155: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204637.45168: variable 'omit' from source: magic vars 41175 1727204637.45243: variable 'omit' from source: magic vars 41175 1727204637.45300: variable 'omit' from source: magic vars 41175 1727204637.45358: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41175 1727204637.45595: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41175 1727204637.45598: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41175 1727204637.45601: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204637.45604: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204637.45606: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41175 1727204637.45609: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204637.45611: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 
1727204637.45690: Set connection var ansible_shell_executable to /bin/sh 41175 1727204637.45701: Set connection var ansible_shell_type to sh 41175 1727204637.45715: Set connection var ansible_pipelining to False 41175 1727204637.45741: Set connection var ansible_timeout to 10 41175 1727204637.45754: Set connection var ansible_connection to ssh 41175 1727204637.45767: Set connection var ansible_module_compression to ZIP_DEFLATED 41175 1727204637.45799: variable 'ansible_shell_executable' from source: unknown 41175 1727204637.45809: variable 'ansible_connection' from source: unknown 41175 1727204637.45819: variable 'ansible_module_compression' from source: unknown 41175 1727204637.45827: variable 'ansible_shell_type' from source: unknown 41175 1727204637.45848: variable 'ansible_shell_executable' from source: unknown 41175 1727204637.45950: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204637.45954: variable 'ansible_pipelining' from source: unknown 41175 1727204637.45957: variable 'ansible_timeout' from source: unknown 41175 1727204637.45960: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204637.46077: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41175 1727204637.46098: variable 'omit' from source: magic vars 41175 1727204637.46167: starting attempt loop 41175 1727204637.46171: running the handler 41175 1727204637.46174: _low_level_execute_command(): starting 41175 1727204637.46176: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41175 1727204637.47069: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204637.47130: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204637.47213: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204637.49649: stdout chunk (state=3): >>>/root <<< 41175 1727204637.49896: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204637.49917: stdout chunk (state=3): >>><<< 41175 1727204637.49931: stderr chunk (state=3): >>><<< 41175 1727204637.49960: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204637.49981: _low_level_execute_command(): starting 41175 1727204637.49995: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204637.4996724-41393-92643068193263 `" && echo ansible-tmp-1727204637.4996724-41393-92643068193263="` echo /root/.ansible/tmp/ansible-tmp-1727204637.4996724-41393-92643068193263 `" ) && sleep 0' 41175 1727204637.50676: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204637.50679: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204637.50682: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204637.50684: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204637.50741: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204637.50781: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204637.50835: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204637.53648: stdout chunk (state=3): >>>ansible-tmp-1727204637.4996724-41393-92643068193263=/root/.ansible/tmp/ansible-tmp-1727204637.4996724-41393-92643068193263 <<< 41175 1727204637.53908: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204637.53959: stderr chunk (state=3): >>><<< 41175 1727204637.53980: stdout chunk (state=3): >>><<< 41175 1727204637.54097: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204637.4996724-41393-92643068193263=/root/.ansible/tmp/ansible-tmp-1727204637.4996724-41393-92643068193263 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204637.54102: variable 'ansible_module_compression' from source: unknown 41175 1727204637.54139: ANSIBALLZ: Using generic lock for ansible.legacy.command 41175 1727204637.54149: ANSIBALLZ: Acquiring lock 41175 1727204637.54157: ANSIBALLZ: Lock acquired: 140088839296144 41175 1727204637.54167: ANSIBALLZ: Creating module 41175 1727204637.73614: ANSIBALLZ: Writing module into payload 41175 1727204637.73753: ANSIBALLZ: Writing module 41175 1727204637.73784: ANSIBALLZ: Renaming module 41175 1727204637.73896: ANSIBALLZ: Done creating module 41175 1727204637.73899: variable 'ansible_facts' from source: unknown 41175 1727204637.73929: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204637.4996724-41393-92643068193263/AnsiballZ_command.py 41175 1727204637.74114: Sending initial data 41175 1727204637.74245: Sent initial data (155 bytes) 41175 1727204637.74894: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 1727204637.74997: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204637.75019: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204637.75043: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204637.75063: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204637.75086: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204637.75176: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204637.77530: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41175 1727204637.77574: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41175 1727204637.77621: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41175qa2aqh90/tmps_69mdga /root/.ansible/tmp/ansible-tmp-1727204637.4996724-41393-92643068193263/AnsiballZ_command.py <<< 41175 1727204637.77625: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204637.4996724-41393-92643068193263/AnsiballZ_command.py" <<< 41175 1727204637.77671: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-41175qa2aqh90/tmps_69mdga" to remote "/root/.ansible/tmp/ansible-tmp-1727204637.4996724-41393-92643068193263/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204637.4996724-41393-92643068193263/AnsiballZ_command.py" <<< 41175 1727204637.78795: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204637.78889: stderr chunk (state=3): >>><<< 41175 1727204637.78906: stdout chunk (state=3): >>><<< 41175 1727204637.79040: done transferring module to remote 41175 1727204637.79044: _low_level_execute_command(): starting 41175 1727204637.79046: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204637.4996724-41393-92643068193263/ /root/.ansible/tmp/ansible-tmp-1727204637.4996724-41393-92643068193263/AnsiballZ_command.py && sleep 0' 41175 1727204637.79613: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 1727204637.79639: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204637.79657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204637.79677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204637.79747: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204637.79794: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204637.79813: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204637.79849: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204637.79976: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204637.82709: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204637.82762: stderr chunk (state=3): >>><<< 41175 1727204637.82779: stdout chunk (state=3): >>><<< 41175 1727204637.82895: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204637.82899: _low_level_execute_command(): starting 41175 1727204637.82905: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204637.4996724-41393-92643068193263/AnsiballZ_command.py && sleep 0' 41175 1727204637.83456: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 1727204637.83470: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204637.83483: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204637.83592: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204637.83606: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master 
version 4 <<< 41175 1727204637.83801: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204638.11709: stdout chunk (state=3): >>> <<< 41175 1727204638.11734: stdout chunk (state=3): >>>{"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:03:58.110787", "end": "2024-09-24 15:03:58.115830", "delta": "0:00:00.005043", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 41175 1727204638.14124: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 41175 1727204638.14235: stderr chunk (state=3): >>><<< 41175 1727204638.14256: stdout chunk (state=3): >>><<< 41175 1727204638.14278: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:03:58.110787", "end": "2024-09-24 15:03:58.115830", "delta": "0:00:00.005043", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 41175 1727204638.14371: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204637.4996724-41393-92643068193263/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41175 1727204638.14376: _low_level_execute_command(): starting 41175 1727204638.14379: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204637.4996724-41393-92643068193263/ > /dev/null 2>&1 && sleep 0' 41175 1727204638.14852: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204638.14856: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 41175 1727204638.14858: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address <<< 41175 1727204638.14861: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 41175 1727204638.14863: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204638.14915: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204638.14921: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204638.14964: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204638.17657: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204638.17703: stderr chunk (state=3): >>><<< 41175 1727204638.17707: stdout chunk (state=3): >>><<< 41175 1727204638.17723: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204638.17735: handler run complete 41175 1727204638.17758: Evaluated conditional (False): False 41175 1727204638.17783: attempt loop complete, returning result 41175 1727204638.17787: _execute() done 41175 1727204638.17793: dumping result to json 41175 1727204638.17797: done dumping result, returning 41175 1727204638.17806: done running TaskExecutor() for managed-node3/TASK: Gather current interface info [12b410aa-8751-f070-39c4-00000000024e] 41175 1727204638.17812: sending task result for task 12b410aa-8751-f070-39c4-00000000024e 41175 1727204638.17927: done sending task result for task 12b410aa-8751-f070-39c4-00000000024e 41175 1727204638.17930: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.005043", "end": "2024-09-24 15:03:58.115830", "rc": 0, "start": "2024-09-24 15:03:58.110787" } STDOUT: bonding_masters eth0 lo 41175 1727204638.18026: no more pending results, returning what we have 41175 1727204638.18030: results queue empty 41175 1727204638.18031: checking for any_errors_fatal 41175 1727204638.18033: done checking for any_errors_fatal 41175 1727204638.18034: checking for max_fail_percentage 41175 1727204638.18035: done checking 
for max_fail_percentage 41175 1727204638.18036: checking to see if all hosts have failed and the running result is not ok 41175 1727204638.18037: done checking to see if all hosts have failed 41175 1727204638.18038: getting the remaining hosts for this loop 41175 1727204638.18040: done getting the remaining hosts for this loop 41175 1727204638.18045: getting the next task for host managed-node3 41175 1727204638.18052: done getting next task for host managed-node3 41175 1727204638.18065: ^ task is: TASK: Set current_interfaces 41175 1727204638.18069: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204638.18073: getting variables 41175 1727204638.18075: in VariableManager get_vars() 41175 1727204638.18121: Calling all_inventory to load vars for managed-node3 41175 1727204638.18124: Calling groups_inventory to load vars for managed-node3 41175 1727204638.18127: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204638.18138: Calling all_plugins_play to load vars for managed-node3 41175 1727204638.18141: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204638.18145: Calling groups_plugins_play to load vars for managed-node3 41175 1727204638.18364: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204638.18562: done with get_vars() 41175 1727204638.18573: done getting variables 41175 1727204638.18631: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 15:03:58 -0400 (0:00:00.749) 0:00:05.325 ***** 41175 1727204638.18656: entering _queue_task() for managed-node3/set_fact 41175 1727204638.18874: worker is 1 (out of 1 available) 41175 1727204638.18892: exiting _queue_task() for managed-node3/set_fact 41175 1727204638.18904: done queuing things up, now waiting for results queue to drain 41175 1727204638.18907: waiting for pending results... 
41175 1727204638.19067: running TaskExecutor() for managed-node3/TASK: Set current_interfaces 41175 1727204638.19156: in run() - task 12b410aa-8751-f070-39c4-00000000024f 41175 1727204638.19171: variable 'ansible_search_path' from source: unknown 41175 1727204638.19175: variable 'ansible_search_path' from source: unknown 41175 1727204638.19208: calling self._execute() 41175 1727204638.19307: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204638.19319: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204638.19322: variable 'omit' from source: magic vars 41175 1727204638.19907: variable 'ansible_distribution_major_version' from source: facts 41175 1727204638.19911: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204638.19931: variable 'omit' from source: magic vars 41175 1727204638.19937: variable 'omit' from source: magic vars 41175 1727204638.20355: variable '_current_interfaces' from source: set_fact 41175 1727204638.20414: variable 'omit' from source: magic vars 41175 1727204638.20455: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41175 1727204638.20710: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41175 1727204638.20734: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41175 1727204638.20757: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204638.20797: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204638.20805: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41175 1727204638.20999: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204638.21003: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204638.21143: Set connection var ansible_shell_executable to /bin/sh 41175 1727204638.21395: Set connection var ansible_shell_type to sh 41175 1727204638.21399: Set connection var ansible_pipelining to False 41175 1727204638.21401: Set connection var ansible_timeout to 10 41175 1727204638.21404: Set connection var ansible_connection to ssh 41175 1727204638.21406: Set connection var ansible_module_compression to ZIP_DEFLATED 41175 1727204638.21408: variable 'ansible_shell_executable' from source: unknown 41175 1727204638.21410: variable 'ansible_connection' from source: unknown 41175 1727204638.21412: variable 'ansible_module_compression' from source: unknown 41175 1727204638.21414: variable 'ansible_shell_type' from source: unknown 41175 1727204638.21416: variable 'ansible_shell_executable' from source: unknown 41175 1727204638.21418: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204638.21420: variable 'ansible_pipelining' from source: unknown 41175 1727204638.21423: variable 'ansible_timeout' from source: unknown 41175 1727204638.21425: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204638.21694: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41175 1727204638.21896: variable 'omit' from source: magic vars 41175 1727204638.21899: starting attempt loop 41175 1727204638.21902: running the handler 41175 1727204638.21904: handler run complete 41175 1727204638.21907: attempt loop complete, returning result 41175 1727204638.21909: _execute() done 41175 1727204638.21911: dumping result to json 41175 1727204638.21914: done dumping result, returning 41175 
1727204638.21917: done running TaskExecutor() for managed-node3/TASK: Set current_interfaces [12b410aa-8751-f070-39c4-00000000024f] 41175 1727204638.21919: sending task result for task 12b410aa-8751-f070-39c4-00000000024f ok: [managed-node3] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 41175 1727204638.22095: no more pending results, returning what we have 41175 1727204638.22099: results queue empty 41175 1727204638.22100: checking for any_errors_fatal 41175 1727204638.22109: done checking for any_errors_fatal 41175 1727204638.22110: checking for max_fail_percentage 41175 1727204638.22112: done checking for max_fail_percentage 41175 1727204638.22113: checking to see if all hosts have failed and the running result is not ok 41175 1727204638.22114: done checking to see if all hosts have failed 41175 1727204638.22115: getting the remaining hosts for this loop 41175 1727204638.22116: done getting the remaining hosts for this loop 41175 1727204638.22122: getting the next task for host managed-node3 41175 1727204638.22132: done getting next task for host managed-node3 41175 1727204638.22136: ^ task is: TASK: Show current_interfaces 41175 1727204638.22139: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204638.22144: getting variables 41175 1727204638.22146: in VariableManager get_vars() 41175 1727204638.22398: Calling all_inventory to load vars for managed-node3 41175 1727204638.22402: Calling groups_inventory to load vars for managed-node3 41175 1727204638.22406: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204638.22419: Calling all_plugins_play to load vars for managed-node3 41175 1727204638.22423: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204638.22428: Calling groups_plugins_play to load vars for managed-node3 41175 1727204638.22784: done sending task result for task 12b410aa-8751-f070-39c4-00000000024f 41175 1727204638.22788: WORKER PROCESS EXITING 41175 1727204638.22818: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204638.23805: done with get_vars() 41175 1727204638.23819: done getting variables 41175 1727204638.23929: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 15:03:58 -0400 (0:00:00.053) 0:00:05.378 ***** 41175 1727204638.23964: entering _queue_task() for managed-node3/debug 41175 1727204638.23967: Creating lock for debug 41175 1727204638.24288: worker is 1 (out of 1 available) 41175 1727204638.24305: exiting _queue_task() for managed-node3/debug 41175 1727204638.24318: done queuing things up, now waiting for results queue to drain 41175 1727204638.24320: waiting for pending results... 
41175 1727204638.24608: running TaskExecutor() for managed-node3/TASK: Show current_interfaces 41175 1727204638.24751: in run() - task 12b410aa-8751-f070-39c4-00000000016a 41175 1727204638.24778: variable 'ansible_search_path' from source: unknown 41175 1727204638.24788: variable 'ansible_search_path' from source: unknown 41175 1727204638.24841: calling self._execute() 41175 1727204638.24953: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204638.24970: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204638.24988: variable 'omit' from source: magic vars 41175 1727204638.25607: variable 'ansible_distribution_major_version' from source: facts 41175 1727204638.25632: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204638.25647: variable 'omit' from source: magic vars 41175 1727204638.25704: variable 'omit' from source: magic vars 41175 1727204638.25844: variable 'current_interfaces' from source: set_fact 41175 1727204638.25885: variable 'omit' from source: magic vars 41175 1727204638.25944: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41175 1727204638.25998: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41175 1727204638.26028: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41175 1727204638.26066: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204638.26085: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204638.26129: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41175 1727204638.26140: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204638.26151: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204638.26297: Set connection var ansible_shell_executable to /bin/sh 41175 1727204638.26306: Set connection var ansible_shell_type to sh 41175 1727204638.26318: Set connection var ansible_pipelining to False 41175 1727204638.26332: Set connection var ansible_timeout to 10 41175 1727204638.26342: Set connection var ansible_connection to ssh 41175 1727204638.26354: Set connection var ansible_module_compression to ZIP_DEFLATED 41175 1727204638.26387: variable 'ansible_shell_executable' from source: unknown 41175 1727204638.26400: variable 'ansible_connection' from source: unknown 41175 1727204638.26409: variable 'ansible_module_compression' from source: unknown 41175 1727204638.26417: variable 'ansible_shell_type' from source: unknown 41175 1727204638.26425: variable 'ansible_shell_executable' from source: unknown 41175 1727204638.26433: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204638.26443: variable 'ansible_pipelining' from source: unknown 41175 1727204638.26450: variable 'ansible_timeout' from source: unknown 41175 1727204638.26459: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204638.26641: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41175 1727204638.26675: variable 'omit' from source: magic vars 41175 1727204638.26691: starting attempt loop 41175 1727204638.26703: running the handler 41175 1727204638.26767: handler run complete 41175 1727204638.26795: attempt loop complete, returning result 41175 1727204638.26804: _execute() done 41175 1727204638.26816: dumping result to json 41175 1727204638.26830: done dumping result, returning 41175 1727204638.26845: done 
running TaskExecutor() for managed-node3/TASK: Show current_interfaces [12b410aa-8751-f070-39c4-00000000016a] 41175 1727204638.26858: sending task result for task 12b410aa-8751-f070-39c4-00000000016a 41175 1727204638.27196: done sending task result for task 12b410aa-8751-f070-39c4-00000000016a 41175 1727204638.27200: WORKER PROCESS EXITING ok: [managed-node3] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 41175 1727204638.27246: no more pending results, returning what we have 41175 1727204638.27250: results queue empty 41175 1727204638.27251: checking for any_errors_fatal 41175 1727204638.27257: done checking for any_errors_fatal 41175 1727204638.27258: checking for max_fail_percentage 41175 1727204638.27260: done checking for max_fail_percentage 41175 1727204638.27261: checking to see if all hosts have failed and the running result is not ok 41175 1727204638.27262: done checking to see if all hosts have failed 41175 1727204638.27263: getting the remaining hosts for this loop 41175 1727204638.27264: done getting the remaining hosts for this loop 41175 1727204638.27269: getting the next task for host managed-node3 41175 1727204638.27276: done getting next task for host managed-node3 41175 1727204638.27280: ^ task is: TASK: Include the task 'manage_test_interface.yml' 41175 1727204638.27282: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204638.27286: getting variables 41175 1727204638.27288: in VariableManager get_vars() 41175 1727204638.27328: Calling all_inventory to load vars for managed-node3 41175 1727204638.27331: Calling groups_inventory to load vars for managed-node3 41175 1727204638.27334: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204638.27346: Calling all_plugins_play to load vars for managed-node3 41175 1727204638.27350: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204638.27354: Calling groups_plugins_play to load vars for managed-node3 41175 1727204638.27698: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204638.28052: done with get_vars() 41175 1727204638.28065: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:17 Tuesday 24 September 2024 15:03:58 -0400 (0:00:00.041) 0:00:05.420 ***** 41175 1727204638.28143: entering _queue_task() for managed-node3/include_tasks 41175 1727204638.28343: worker is 1 (out of 1 available) 41175 1727204638.28357: exiting _queue_task() for managed-node3/include_tasks 41175 1727204638.28369: done queuing things up, now waiting for results queue to drain 41175 1727204638.28371: waiting for pending results... 
41175 1727204638.28546: running TaskExecutor() for managed-node3/TASK: Include the task 'manage_test_interface.yml' 41175 1727204638.28609: in run() - task 12b410aa-8751-f070-39c4-00000000000d 41175 1727204638.28625: variable 'ansible_search_path' from source: unknown 41175 1727204638.28655: calling self._execute() 41175 1727204638.28737: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204638.28743: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204638.28754: variable 'omit' from source: magic vars 41175 1727204638.29075: variable 'ansible_distribution_major_version' from source: facts 41175 1727204638.29086: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204638.29094: _execute() done 41175 1727204638.29097: dumping result to json 41175 1727204638.29103: done dumping result, returning 41175 1727204638.29110: done running TaskExecutor() for managed-node3/TASK: Include the task 'manage_test_interface.yml' [12b410aa-8751-f070-39c4-00000000000d] 41175 1727204638.29116: sending task result for task 12b410aa-8751-f070-39c4-00000000000d 41175 1727204638.29223: done sending task result for task 12b410aa-8751-f070-39c4-00000000000d 41175 1727204638.29227: WORKER PROCESS EXITING 41175 1727204638.29270: no more pending results, returning what we have 41175 1727204638.29275: in VariableManager get_vars() 41175 1727204638.29320: Calling all_inventory to load vars for managed-node3 41175 1727204638.29323: Calling groups_inventory to load vars for managed-node3 41175 1727204638.29326: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204638.29337: Calling all_plugins_play to load vars for managed-node3 41175 1727204638.29340: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204638.29344: Calling groups_plugins_play to load vars for managed-node3 41175 1727204638.29502: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204638.29682: done with get_vars() 41175 1727204638.29690: variable 'ansible_search_path' from source: unknown 41175 1727204638.29700: we have included files to process 41175 1727204638.29701: generating all_blocks data 41175 1727204638.29702: done generating all_blocks data 41175 1727204638.29706: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 41175 1727204638.29706: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 41175 1727204638.29708: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 41175 1727204638.30151: in VariableManager get_vars() 41175 1727204638.30169: done with get_vars() 41175 1727204638.30353: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 41175 1727204638.30823: done processing included file 41175 1727204638.30825: iterating over new_blocks loaded from include file 41175 1727204638.30826: in VariableManager get_vars() 41175 1727204638.30840: done with get_vars() 41175 1727204638.30842: filtering new block on tags 41175 1727204638.30867: done filtering new block on tags 41175 1727204638.30870: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed-node3 41175 1727204638.30886: extending task lists for all hosts with included blocks 41175 1727204638.32505: done extending task lists 41175 1727204638.32507: done processing included files 41175 1727204638.32508: results queue empty 41175 1727204638.32508: checking for any_errors_fatal 41175 1727204638.32511: done checking for 
any_errors_fatal 41175 1727204638.32512: checking for max_fail_percentage 41175 1727204638.32513: done checking for max_fail_percentage 41175 1727204638.32514: checking to see if all hosts have failed and the running result is not ok 41175 1727204638.32514: done checking to see if all hosts have failed 41175 1727204638.32515: getting the remaining hosts for this loop 41175 1727204638.32516: done getting the remaining hosts for this loop 41175 1727204638.32518: getting the next task for host managed-node3 41175 1727204638.32521: done getting next task for host managed-node3 41175 1727204638.32523: ^ task is: TASK: Ensure state in ["present", "absent"] 41175 1727204638.32525: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204638.32527: getting variables 41175 1727204638.32528: in VariableManager get_vars() 41175 1727204638.32539: Calling all_inventory to load vars for managed-node3 41175 1727204638.32541: Calling groups_inventory to load vars for managed-node3 41175 1727204638.32542: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204638.32547: Calling all_plugins_play to load vars for managed-node3 41175 1727204638.32549: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204638.32551: Calling groups_plugins_play to load vars for managed-node3 41175 1727204638.32681: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204638.32861: done with get_vars() 41175 1727204638.32869: done getting variables 41175 1727204638.32924: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Tuesday 24 September 2024 15:03:58 -0400 (0:00:00.048) 0:00:05.468 ***** 41175 1727204638.32949: entering _queue_task() for managed-node3/fail 41175 1727204638.32951: Creating lock for fail 41175 1727204638.33207: worker is 1 (out of 1 available) 41175 1727204638.33221: exiting _queue_task() for managed-node3/fail 41175 1727204638.33235: done queuing things up, now waiting for results queue to drain 41175 1727204638.33237: waiting for pending results... 
41175 1727204638.33425: running TaskExecutor() for managed-node3/TASK: Ensure state in ["present", "absent"] 41175 1727204638.33495: in run() - task 12b410aa-8751-f070-39c4-00000000026a 41175 1727204638.33512: variable 'ansible_search_path' from source: unknown 41175 1727204638.33516: variable 'ansible_search_path' from source: unknown 41175 1727204638.33549: calling self._execute() 41175 1727204638.33626: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204638.33633: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204638.33643: variable 'omit' from source: magic vars 41175 1727204638.33975: variable 'ansible_distribution_major_version' from source: facts 41175 1727204638.33987: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204638.34110: variable 'state' from source: include params 41175 1727204638.34115: Evaluated conditional (state not in ["present", "absent"]): False 41175 1727204638.34121: when evaluation is False, skipping this task 41175 1727204638.34124: _execute() done 41175 1727204638.34127: dumping result to json 41175 1727204638.34129: done dumping result, returning 41175 1727204638.34138: done running TaskExecutor() for managed-node3/TASK: Ensure state in ["present", "absent"] [12b410aa-8751-f070-39c4-00000000026a] 41175 1727204638.34144: sending task result for task 12b410aa-8751-f070-39c4-00000000026a 41175 1727204638.34237: done sending task result for task 12b410aa-8751-f070-39c4-00000000026a 41175 1727204638.34242: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 41175 1727204638.34298: no more pending results, returning what we have 41175 1727204638.34303: results queue empty 41175 1727204638.34304: checking for any_errors_fatal 41175 1727204638.34306: done checking for any_errors_fatal 41175 1727204638.34307: 
checking for max_fail_percentage 41175 1727204638.34308: done checking for max_fail_percentage 41175 1727204638.34309: checking to see if all hosts have failed and the running result is not ok 41175 1727204638.34310: done checking to see if all hosts have failed 41175 1727204638.34311: getting the remaining hosts for this loop 41175 1727204638.34313: done getting the remaining hosts for this loop 41175 1727204638.34320: getting the next task for host managed-node3 41175 1727204638.34326: done getting next task for host managed-node3 41175 1727204638.34328: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 41175 1727204638.34332: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204638.34335: getting variables 41175 1727204638.34337: in VariableManager get_vars() 41175 1727204638.34373: Calling all_inventory to load vars for managed-node3 41175 1727204638.34376: Calling groups_inventory to load vars for managed-node3 41175 1727204638.34379: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204638.34392: Calling all_plugins_play to load vars for managed-node3 41175 1727204638.34395: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204638.34399: Calling groups_plugins_play to load vars for managed-node3 41175 1727204638.34579: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204638.34768: done with get_vars() 41175 1727204638.34776: done getting variables 41175 1727204638.34822: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Tuesday 24 September 2024 15:03:58 -0400 (0:00:00.018) 0:00:05.487 ***** 41175 1727204638.34844: entering _queue_task() for managed-node3/fail 41175 1727204638.35045: worker is 1 (out of 1 available) 41175 1727204638.35060: exiting _queue_task() for managed-node3/fail 41175 1727204638.35073: done queuing things up, now waiting for results queue to drain 41175 1727204638.35076: waiting for pending results... 
41175 1727204638.35236: running TaskExecutor() for managed-node3/TASK: Ensure type in ["dummy", "tap", "veth"] 41175 1727204638.35302: in run() - task 12b410aa-8751-f070-39c4-00000000026b 41175 1727204638.35325: variable 'ansible_search_path' from source: unknown 41175 1727204638.35329: variable 'ansible_search_path' from source: unknown 41175 1727204638.35352: calling self._execute() 41175 1727204638.35423: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204638.35432: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204638.35442: variable 'omit' from source: magic vars 41175 1727204638.35733: variable 'ansible_distribution_major_version' from source: facts 41175 1727204638.35746: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204638.35868: variable 'type' from source: set_fact 41175 1727204638.35874: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 41175 1727204638.35883: when evaluation is False, skipping this task 41175 1727204638.35886: _execute() done 41175 1727204638.35891: dumping result to json 41175 1727204638.35894: done dumping result, returning 41175 1727204638.35899: done running TaskExecutor() for managed-node3/TASK: Ensure type in ["dummy", "tap", "veth"] [12b410aa-8751-f070-39c4-00000000026b] 41175 1727204638.35906: sending task result for task 12b410aa-8751-f070-39c4-00000000026b 41175 1727204638.36002: done sending task result for task 12b410aa-8751-f070-39c4-00000000026b 41175 1727204638.36005: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 41175 1727204638.36054: no more pending results, returning what we have 41175 1727204638.36058: results queue empty 41175 1727204638.36059: checking for any_errors_fatal 41175 1727204638.36064: done checking for any_errors_fatal 41175 1727204638.36065: 
checking for max_fail_percentage 41175 1727204638.36067: done checking for max_fail_percentage 41175 1727204638.36068: checking to see if all hosts have failed and the running result is not ok 41175 1727204638.36069: done checking to see if all hosts have failed 41175 1727204638.36069: getting the remaining hosts for this loop 41175 1727204638.36071: done getting the remaining hosts for this loop 41175 1727204638.36074: getting the next task for host managed-node3 41175 1727204638.36080: done getting next task for host managed-node3 41175 1727204638.36082: ^ task is: TASK: Include the task 'show_interfaces.yml' 41175 1727204638.36085: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204638.36091: getting variables 41175 1727204638.36092: in VariableManager get_vars() 41175 1727204638.36129: Calling all_inventory to load vars for managed-node3 41175 1727204638.36133: Calling groups_inventory to load vars for managed-node3 41175 1727204638.36135: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204638.36143: Calling all_plugins_play to load vars for managed-node3 41175 1727204638.36145: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204638.36148: Calling groups_plugins_play to load vars for managed-node3 41175 1727204638.36297: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204638.36500: done with get_vars() 41175 1727204638.36508: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Tuesday 24 September 2024 15:03:58 -0400 (0:00:00.017) 0:00:05.504 ***** 41175 1727204638.36581: entering _queue_task() for managed-node3/include_tasks 41175 1727204638.36779: worker is 1 (out of 1 available) 41175 1727204638.36796: exiting _queue_task() for managed-node3/include_tasks 41175 1727204638.36808: done queuing things up, now waiting for results queue to drain 41175 1727204638.36810: waiting for pending results... 
41175 1727204638.36966: running TaskExecutor() for managed-node3/TASK: Include the task 'show_interfaces.yml' 41175 1727204638.37037: in run() - task 12b410aa-8751-f070-39c4-00000000026c 41175 1727204638.37050: variable 'ansible_search_path' from source: unknown 41175 1727204638.37060: variable 'ansible_search_path' from source: unknown 41175 1727204638.37091: calling self._execute() 41175 1727204638.37170: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204638.37177: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204638.37188: variable 'omit' from source: magic vars 41175 1727204638.37497: variable 'ansible_distribution_major_version' from source: facts 41175 1727204638.37508: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204638.37515: _execute() done 41175 1727204638.37522: dumping result to json 41175 1727204638.37527: done dumping result, returning 41175 1727204638.37534: done running TaskExecutor() for managed-node3/TASK: Include the task 'show_interfaces.yml' [12b410aa-8751-f070-39c4-00000000026c] 41175 1727204638.37540: sending task result for task 12b410aa-8751-f070-39c4-00000000026c 41175 1727204638.37634: done sending task result for task 12b410aa-8751-f070-39c4-00000000026c 41175 1727204638.37637: WORKER PROCESS EXITING 41175 1727204638.37665: no more pending results, returning what we have 41175 1727204638.37670: in VariableManager get_vars() 41175 1727204638.37714: Calling all_inventory to load vars for managed-node3 41175 1727204638.37717: Calling groups_inventory to load vars for managed-node3 41175 1727204638.37720: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204638.37731: Calling all_plugins_play to load vars for managed-node3 41175 1727204638.37734: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204638.37737: Calling groups_plugins_play to load vars for managed-node3 41175 1727204638.37902: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204638.38080: done with get_vars() 41175 1727204638.38087: variable 'ansible_search_path' from source: unknown 41175 1727204638.38088: variable 'ansible_search_path' from source: unknown 41175 1727204638.38121: we have included files to process 41175 1727204638.38122: generating all_blocks data 41175 1727204638.38124: done generating all_blocks data 41175 1727204638.38127: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 41175 1727204638.38128: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 41175 1727204638.38130: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 41175 1727204638.38215: in VariableManager get_vars() 41175 1727204638.38236: done with get_vars() 41175 1727204638.38327: done processing included file 41175 1727204638.38329: iterating over new_blocks loaded from include file 41175 1727204638.38330: in VariableManager get_vars() 41175 1727204638.38344: done with get_vars() 41175 1727204638.38345: filtering new block on tags 41175 1727204638.38360: done filtering new block on tags 41175 1727204638.38362: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node3 41175 1727204638.38365: extending task lists for all hosts with included blocks 41175 1727204638.38692: done extending task lists 41175 1727204638.38693: done processing included files 41175 1727204638.38694: results queue empty 41175 1727204638.38695: checking for any_errors_fatal 41175 1727204638.38698: done checking for any_errors_fatal 41175 1727204638.38698: checking for 
max_fail_percentage 41175 1727204638.38699: done checking for max_fail_percentage 41175 1727204638.38700: checking to see if all hosts have failed and the running result is not ok 41175 1727204638.38700: done checking to see if all hosts have failed 41175 1727204638.38701: getting the remaining hosts for this loop 41175 1727204638.38702: done getting the remaining hosts for this loop 41175 1727204638.38704: getting the next task for host managed-node3 41175 1727204638.38707: done getting next task for host managed-node3 41175 1727204638.38709: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 41175 1727204638.38711: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204638.38713: getting variables 41175 1727204638.38713: in VariableManager get_vars() 41175 1727204638.38747: Calling all_inventory to load vars for managed-node3 41175 1727204638.38750: Calling groups_inventory to load vars for managed-node3 41175 1727204638.38753: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204638.38760: Calling all_plugins_play to load vars for managed-node3 41175 1727204638.38762: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204638.38764: Calling groups_plugins_play to load vars for managed-node3 41175 1727204638.38886: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204638.39059: done with get_vars() 41175 1727204638.39067: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Tuesday 24 September 2024 15:03:58 -0400 (0:00:00.025) 0:00:05.530 ***** 41175 1727204638.39129: entering _queue_task() for managed-node3/include_tasks 41175 1727204638.39323: worker is 1 (out of 1 available) 41175 1727204638.39337: exiting _queue_task() for managed-node3/include_tasks 41175 1727204638.39350: done queuing things up, now waiting for results queue to drain 41175 1727204638.39352: waiting for pending results... 
41175 1727204638.39517: running TaskExecutor() for managed-node3/TASK: Include the task 'get_current_interfaces.yml' 41175 1727204638.39595: in run() - task 12b410aa-8751-f070-39c4-000000000369 41175 1727204638.39606: variable 'ansible_search_path' from source: unknown 41175 1727204638.39610: variable 'ansible_search_path' from source: unknown 41175 1727204638.39644: calling self._execute() 41175 1727204638.39713: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204638.39719: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204638.39731: variable 'omit' from source: magic vars 41175 1727204638.40038: variable 'ansible_distribution_major_version' from source: facts 41175 1727204638.40049: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204638.40056: _execute() done 41175 1727204638.40059: dumping result to json 41175 1727204638.40065: done dumping result, returning 41175 1727204638.40071: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_current_interfaces.yml' [12b410aa-8751-f070-39c4-000000000369] 41175 1727204638.40078: sending task result for task 12b410aa-8751-f070-39c4-000000000369 41175 1727204638.40166: done sending task result for task 12b410aa-8751-f070-39c4-000000000369 41175 1727204638.40169: WORKER PROCESS EXITING 41175 1727204638.40203: no more pending results, returning what we have 41175 1727204638.40207: in VariableManager get_vars() 41175 1727204638.40248: Calling all_inventory to load vars for managed-node3 41175 1727204638.40251: Calling groups_inventory to load vars for managed-node3 41175 1727204638.40253: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204638.40264: Calling all_plugins_play to load vars for managed-node3 41175 1727204638.40267: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204638.40271: Calling groups_plugins_play to load vars for managed-node3 41175 
1727204638.40438: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204638.40638: done with get_vars() 41175 1727204638.40644: variable 'ansible_search_path' from source: unknown 41175 1727204638.40645: variable 'ansible_search_path' from source: unknown 41175 1727204638.40688: we have included files to process 41175 1727204638.40691: generating all_blocks data 41175 1727204638.40692: done generating all_blocks data 41175 1727204638.40693: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 41175 1727204638.40694: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 41175 1727204638.40696: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 41175 1727204638.40903: done processing included file 41175 1727204638.40905: iterating over new_blocks loaded from include file 41175 1727204638.40906: in VariableManager get_vars() 41175 1727204638.40922: done with get_vars() 41175 1727204638.40923: filtering new block on tags 41175 1727204638.40937: done filtering new block on tags 41175 1727204638.40939: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed-node3 41175 1727204638.40943: extending task lists for all hosts with included blocks 41175 1727204638.41064: done extending task lists 41175 1727204638.41066: done processing included files 41175 1727204638.41066: results queue empty 41175 1727204638.41067: checking for any_errors_fatal 41175 1727204638.41069: done checking for any_errors_fatal 41175 1727204638.41069: checking for max_fail_percentage 41175 1727204638.41070: done 
checking for max_fail_percentage 41175 1727204638.41071: checking to see if all hosts have failed and the running result is not ok 41175 1727204638.41071: done checking to see if all hosts have failed 41175 1727204638.41072: getting the remaining hosts for this loop 41175 1727204638.41073: done getting the remaining hosts for this loop 41175 1727204638.41075: getting the next task for host managed-node3 41175 1727204638.41078: done getting next task for host managed-node3 41175 1727204638.41080: ^ task is: TASK: Gather current interface info 41175 1727204638.41083: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204638.41084: getting variables 41175 1727204638.41085: in VariableManager get_vars() 41175 1727204638.41097: Calling all_inventory to load vars for managed-node3 41175 1727204638.41098: Calling groups_inventory to load vars for managed-node3 41175 1727204638.41100: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204638.41104: Calling all_plugins_play to load vars for managed-node3 41175 1727204638.41106: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204638.41108: Calling groups_plugins_play to load vars for managed-node3 41175 1727204638.41235: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204638.41415: done with get_vars() 41175 1727204638.41423: done getting variables 41175 1727204638.41454: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Tuesday 24 September 2024 15:03:58 -0400 (0:00:00.023) 0:00:05.553 ***** 41175 1727204638.41476: entering _queue_task() for managed-node3/command 41175 1727204638.41666: worker is 1 (out of 1 available) 41175 1727204638.41680: exiting _queue_task() for managed-node3/command 41175 1727204638.41695: done queuing things up, now waiting for results queue to drain 41175 1727204638.41697: waiting for pending results... 
41175 1727204638.41858: running TaskExecutor() for managed-node3/TASK: Gather current interface info 41175 1727204638.41941: in run() - task 12b410aa-8751-f070-39c4-0000000003a0 41175 1727204638.41954: variable 'ansible_search_path' from source: unknown 41175 1727204638.41958: variable 'ansible_search_path' from source: unknown 41175 1727204638.41986: calling self._execute() 41175 1727204638.42058: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204638.42065: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204638.42075: variable 'omit' from source: magic vars 41175 1727204638.42419: variable 'ansible_distribution_major_version' from source: facts 41175 1727204638.42431: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204638.42438: variable 'omit' from source: magic vars 41175 1727204638.42486: variable 'omit' from source: magic vars 41175 1727204638.42515: variable 'omit' from source: magic vars 41175 1727204638.42552: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41175 1727204638.42585: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41175 1727204638.42605: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41175 1727204638.42623: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204638.42635: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204638.42660: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41175 1727204638.42664: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204638.42668: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 
1727204638.42758: Set connection var ansible_shell_executable to /bin/sh 41175 1727204638.42762: Set connection var ansible_shell_type to sh 41175 1727204638.42768: Set connection var ansible_pipelining to False 41175 1727204638.42777: Set connection var ansible_timeout to 10 41175 1727204638.42783: Set connection var ansible_connection to ssh 41175 1727204638.42792: Set connection var ansible_module_compression to ZIP_DEFLATED 41175 1727204638.42814: variable 'ansible_shell_executable' from source: unknown 41175 1727204638.42818: variable 'ansible_connection' from source: unknown 41175 1727204638.42823: variable 'ansible_module_compression' from source: unknown 41175 1727204638.42826: variable 'ansible_shell_type' from source: unknown 41175 1727204638.42831: variable 'ansible_shell_executable' from source: unknown 41175 1727204638.42835: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204638.42840: variable 'ansible_pipelining' from source: unknown 41175 1727204638.42844: variable 'ansible_timeout' from source: unknown 41175 1727204638.42849: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204638.42969: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41175 1727204638.42978: variable 'omit' from source: magic vars 41175 1727204638.42985: starting attempt loop 41175 1727204638.42988: running the handler 41175 1727204638.43004: _low_level_execute_command(): starting 41175 1727204638.43013: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41175 1727204638.43564: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 41175 1727204638.43568: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204638.43572: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 41175 1727204638.43575: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204638.43642: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204638.43644: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204638.43646: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204638.43685: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204638.46063: stdout chunk (state=3): >>>/root <<< 41175 1727204638.46228: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204638.46283: stderr chunk (state=3): >>><<< 41175 1727204638.46287: stdout chunk (state=3): >>><<< 41175 1727204638.46310: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 
10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204638.46323: _low_level_execute_command(): starting 41175 1727204638.46330: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204638.4630976-41428-234975936909114 `" && echo ansible-tmp-1727204638.4630976-41428-234975936909114="` echo /root/.ansible/tmp/ansible-tmp-1727204638.4630976-41428-234975936909114 `" ) && sleep 0' 41175 1727204638.46810: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204638.46814: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204638.46817: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204638.46825: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204638.46874: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204638.46877: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204638.46926: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204638.50076: stdout chunk (state=3): >>>ansible-tmp-1727204638.4630976-41428-234975936909114=/root/.ansible/tmp/ansible-tmp-1727204638.4630976-41428-234975936909114 <<< 41175 1727204638.50331: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204638.50394: stderr chunk (state=3): >>><<< 41175 1727204638.50398: stdout chunk (state=3): >>><<< 41175 1727204638.50406: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204638.4630976-41428-234975936909114=/root/.ansible/tmp/ansible-tmp-1727204638.4630976-41428-234975936909114 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204638.50435: variable 'ansible_module_compression' from source: unknown 41175 1727204638.50476: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-41175qa2aqh90/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 41175 1727204638.50510: variable 'ansible_facts' from source: unknown 41175 1727204638.50572: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204638.4630976-41428-234975936909114/AnsiballZ_command.py 41175 1727204638.50686: Sending initial data 41175 1727204638.50692: Sent initial data (156 bytes) 41175 1727204638.51170: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204638.51173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204638.51176: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204638.51178: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204638.51242: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204638.51245: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204638.51246: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204638.51290: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204638.54036: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41175 1727204638.54072: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41175 1727204638.54117: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41175qa2aqh90/tmpgmfmo9gy /root/.ansible/tmp/ansible-tmp-1727204638.4630976-41428-234975936909114/AnsiballZ_command.py <<< 41175 1727204638.54124: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204638.4630976-41428-234975936909114/AnsiballZ_command.py" <<< 41175 1727204638.54159: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-41175qa2aqh90/tmpgmfmo9gy" to remote "/root/.ansible/tmp/ansible-tmp-1727204638.4630976-41428-234975936909114/AnsiballZ_command.py" <<< 41175 1727204638.54167: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204638.4630976-41428-234975936909114/AnsiballZ_command.py" <<< 41175 1727204638.54973: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204638.55034: stderr chunk (state=3): >>><<< 41175 1727204638.55037: stdout chunk (state=3): >>><<< 41175 1727204638.55063: done transferring module to remote 41175 1727204638.55070: _low_level_execute_command(): starting 41175 1727204638.55078: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204638.4630976-41428-234975936909114/ /root/.ansible/tmp/ansible-tmp-1727204638.4630976-41428-234975936909114/AnsiballZ_command.py && sleep 0' 41175 1727204638.55535: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204638.55538: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 
1727204638.55541: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204638.55543: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204638.55606: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204638.55613: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204638.55645: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204638.58280: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204638.58334: stderr chunk (state=3): >>><<< 41175 1727204638.58337: stdout chunk (state=3): >>><<< 41175 1727204638.58349: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204638.58352: _low_level_execute_command(): starting 41175 1727204638.58358: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204638.4630976-41428-234975936909114/AnsiballZ_command.py && sleep 0' 41175 1727204638.58796: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204638.58800: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204638.58802: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204638.58804: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204638.58862: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204638.58865: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 
4 <<< 41175 1727204638.58912: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204638.86698: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:03:58.859973", "end": "2024-09-24 15:03:58.865345", "delta": "0:00:00.005372", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 41175 1727204638.89229: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 41175 1727204638.89285: stderr chunk (state=3): >>><<< 41175 1727204638.89302: stdout chunk (state=3): >>><<< 41175 1727204638.89343: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:03:58.859973", "end": "2024-09-24 15:03:58.865345", "delta": "0:00:00.005372", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 41175 1727204638.89404: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204638.4630976-41428-234975936909114/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41175 1727204638.89444: _low_level_execute_command(): starting 41175 1727204638.89447: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204638.4630976-41428-234975936909114/ > /dev/null 2>&1 && sleep 0' 41175 1727204638.90160: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 1727204638.90282: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204638.90303: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204638.90330: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204638.90356: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204638.90444: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204638.93180: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204638.93195: stdout chunk (state=3): >>><<< 41175 1727204638.93208: stderr chunk (state=3): >>><<< 41175 1727204638.93235: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204638.93250: handler run complete 41175 1727204638.93397: Evaluated conditional (False): False 41175 1727204638.93401: attempt loop complete, returning result 41175 1727204638.93403: _execute() done 41175 1727204638.93405: dumping result to json 41175 1727204638.93408: done dumping result, returning 41175 1727204638.93410: done running TaskExecutor() for managed-node3/TASK: Gather current interface info [12b410aa-8751-f070-39c4-0000000003a0] 41175 1727204638.93412: sending task result for task 12b410aa-8751-f070-39c4-0000000003a0 41175 1727204638.93502: done sending task result for task 12b410aa-8751-f070-39c4-0000000003a0 ok: [managed-node3] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.005372", "end": "2024-09-24 15:03:58.865345", "rc": 0, "start": "2024-09-24 15:03:58.859973" } STDOUT: bonding_masters eth0 lo 41175 1727204638.93613: no more pending results, returning what we have 41175 1727204638.93620: results queue empty 41175 1727204638.93622: checking for any_errors_fatal 41175 1727204638.93625: done checking for any_errors_fatal 41175 1727204638.93626: checking for max_fail_percentage 41175 1727204638.93628: done checking for max_fail_percentage 41175 1727204638.93629: checking to see if all hosts have failed and the running result is not ok 41175 1727204638.93630: done checking to see if all hosts have failed 41175 1727204638.93631: getting the remaining hosts 
for this loop 41175 1727204638.93633: done getting the remaining hosts for this loop 41175 1727204638.93640: getting the next task for host managed-node3 41175 1727204638.93649: done getting next task for host managed-node3 41175 1727204638.93653: ^ task is: TASK: Set current_interfaces 41175 1727204638.93659: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204638.93664: getting variables 41175 1727204638.93667: in VariableManager get_vars() 41175 1727204638.94029: Calling all_inventory to load vars for managed-node3 41175 1727204638.94038: Calling groups_inventory to load vars for managed-node3 41175 1727204638.94041: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204638.94049: WORKER PROCESS EXITING 41175 1727204638.94062: Calling all_plugins_play to load vars for managed-node3 41175 1727204638.94066: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204638.94070: Calling groups_plugins_play to load vars for managed-node3 41175 1727204638.94511: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204638.94879: done with get_vars() 41175 1727204638.94899: done getting variables 41175 1727204638.94981: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 15:03:58 -0400 (0:00:00.535) 0:00:06.089 ***** 41175 1727204638.95038: entering _queue_task() for managed-node3/set_fact 41175 1727204638.95383: worker is 1 (out of 1 available) 41175 1727204638.95398: exiting _queue_task() for managed-node3/set_fact 41175 1727204638.95410: done queuing things up, now waiting for results queue to drain 41175 1727204638.95411: waiting for pending results... 
41175 1727204638.95729: running TaskExecutor() for managed-node3/TASK: Set current_interfaces 41175 1727204638.95898: in run() - task 12b410aa-8751-f070-39c4-0000000003a1 41175 1727204638.95922: variable 'ansible_search_path' from source: unknown 41175 1727204638.95931: variable 'ansible_search_path' from source: unknown 41175 1727204638.95970: calling self._execute() 41175 1727204638.96076: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204638.96096: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204638.96225: variable 'omit' from source: magic vars 41175 1727204638.96630: variable 'ansible_distribution_major_version' from source: facts 41175 1727204638.96668: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204638.96676: variable 'omit' from source: magic vars 41175 1727204638.96766: variable 'omit' from source: magic vars 41175 1727204638.96986: variable '_current_interfaces' from source: set_fact 41175 1727204638.97026: variable 'omit' from source: magic vars 41175 1727204638.97072: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41175 1727204638.97144: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41175 1727204638.97175: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41175 1727204638.97219: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204638.97243: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204638.97283: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41175 1727204638.97307: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204638.97310: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204638.97472: Set connection var ansible_shell_executable to /bin/sh 41175 1727204638.97481: Set connection var ansible_shell_type to sh 41175 1727204638.97526: Set connection var ansible_pipelining to False 41175 1727204638.97536: Set connection var ansible_timeout to 10 41175 1727204638.97540: Set connection var ansible_connection to ssh 41175 1727204638.97547: Set connection var ansible_module_compression to ZIP_DEFLATED 41175 1727204638.97586: variable 'ansible_shell_executable' from source: unknown 41175 1727204638.97635: variable 'ansible_connection' from source: unknown 41175 1727204638.97648: variable 'ansible_module_compression' from source: unknown 41175 1727204638.97652: variable 'ansible_shell_type' from source: unknown 41175 1727204638.97655: variable 'ansible_shell_executable' from source: unknown 41175 1727204638.97657: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204638.97663: variable 'ansible_pipelining' from source: unknown 41175 1727204638.97667: variable 'ansible_timeout' from source: unknown 41175 1727204638.97669: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204638.97866: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41175 1727204638.97960: variable 'omit' from source: magic vars 41175 1727204638.97964: starting attempt loop 41175 1727204638.97968: running the handler 41175 1727204638.97971: handler run complete 41175 1727204638.97973: attempt loop complete, returning result 41175 1727204638.97975: _execute() done 41175 1727204638.97977: dumping result to json 41175 1727204638.97979: done dumping result, returning 41175 
1727204638.97982: done running TaskExecutor() for managed-node3/TASK: Set current_interfaces [12b410aa-8751-f070-39c4-0000000003a1] 41175 1727204638.97986: sending task result for task 12b410aa-8751-f070-39c4-0000000003a1 41175 1727204638.98174: done sending task result for task 12b410aa-8751-f070-39c4-0000000003a1 41175 1727204638.98178: WORKER PROCESS EXITING ok: [managed-node3] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 41175 1727204638.98266: no more pending results, returning what we have 41175 1727204638.98271: results queue empty 41175 1727204638.98273: checking for any_errors_fatal 41175 1727204638.98280: done checking for any_errors_fatal 41175 1727204638.98281: checking for max_fail_percentage 41175 1727204638.98283: done checking for max_fail_percentage 41175 1727204638.98285: checking to see if all hosts have failed and the running result is not ok 41175 1727204638.98286: done checking to see if all hosts have failed 41175 1727204638.98287: getting the remaining hosts for this loop 41175 1727204638.98291: done getting the remaining hosts for this loop 41175 1727204638.98297: getting the next task for host managed-node3 41175 1727204638.98309: done getting next task for host managed-node3 41175 1727204638.98312: ^ task is: TASK: Show current_interfaces 41175 1727204638.98321: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41175 1727204638.98326: getting variables 41175 1727204638.98328: in VariableManager get_vars() 41175 1727204638.98377: Calling all_inventory to load vars for managed-node3 41175 1727204638.98380: Calling groups_inventory to load vars for managed-node3 41175 1727204638.98384: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204638.98610: Calling all_plugins_play to load vars for managed-node3 41175 1727204638.98615: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204638.98623: Calling groups_plugins_play to load vars for managed-node3 41175 1727204638.98898: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204638.99561: done with get_vars() 41175 1727204638.99573: done getting variables 41175 1727204638.99649: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 15:03:58 -0400 (0:00:00.046) 0:00:06.135 ***** 41175 1727204638.99683: entering _queue_task() for managed-node3/debug 41175 1727204638.99984: worker is 1 (out of 1 available) 41175 1727204639.00102: exiting _queue_task() for managed-node3/debug 41175 1727204639.00113: done queuing things up, now waiting for results queue to drain 41175 1727204639.00115: waiting for pending results... 
41175 1727204639.00476: running TaskExecutor() for managed-node3/TASK: Show current_interfaces 41175 1727204639.00521: in run() - task 12b410aa-8751-f070-39c4-00000000036a 41175 1727204639.00543: variable 'ansible_search_path' from source: unknown 41175 1727204639.00552: variable 'ansible_search_path' from source: unknown 41175 1727204639.00605: calling self._execute() 41175 1727204639.00789: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204639.00795: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204639.00798: variable 'omit' from source: magic vars 41175 1727204639.01255: variable 'ansible_distribution_major_version' from source: facts 41175 1727204639.01335: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204639.01338: variable 'omit' from source: magic vars 41175 1727204639.01361: variable 'omit' from source: magic vars 41175 1727204639.01504: variable 'current_interfaces' from source: set_fact 41175 1727204639.01550: variable 'omit' from source: magic vars 41175 1727204639.01609: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41175 1727204639.01663: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41175 1727204639.01699: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41175 1727204639.01733: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204639.01768: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204639.01802: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41175 1727204639.01876: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204639.01881: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204639.01974: Set connection var ansible_shell_executable to /bin/sh 41175 1727204639.01988: Set connection var ansible_shell_type to sh 41175 1727204639.02005: Set connection var ansible_pipelining to False 41175 1727204639.02023: Set connection var ansible_timeout to 10 41175 1727204639.02093: Set connection var ansible_connection to ssh 41175 1727204639.02098: Set connection var ansible_module_compression to ZIP_DEFLATED 41175 1727204639.02101: variable 'ansible_shell_executable' from source: unknown 41175 1727204639.02104: variable 'ansible_connection' from source: unknown 41175 1727204639.02107: variable 'ansible_module_compression' from source: unknown 41175 1727204639.02109: variable 'ansible_shell_type' from source: unknown 41175 1727204639.02112: variable 'ansible_shell_executable' from source: unknown 41175 1727204639.02142: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204639.02145: variable 'ansible_pipelining' from source: unknown 41175 1727204639.02148: variable 'ansible_timeout' from source: unknown 41175 1727204639.02195: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204639.02351: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41175 1727204639.02375: variable 'omit' from source: magic vars 41175 1727204639.02386: starting attempt loop 41175 1727204639.02395: running the handler 41175 1727204639.02467: handler run complete 41175 1727204639.02495: attempt loop complete, returning result 41175 1727204639.02534: _execute() done 41175 1727204639.02537: dumping result to json 41175 1727204639.02540: done dumping result, returning 41175 1727204639.02543: done 
running TaskExecutor() for managed-node3/TASK: Show current_interfaces [12b410aa-8751-f070-39c4-00000000036a] 41175 1727204639.02547: sending task result for task 12b410aa-8751-f070-39c4-00000000036a ok: [managed-node3] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 41175 1727204639.02731: no more pending results, returning what we have 41175 1727204639.02736: results queue empty 41175 1727204639.02738: checking for any_errors_fatal 41175 1727204639.02747: done checking for any_errors_fatal 41175 1727204639.02749: checking for max_fail_percentage 41175 1727204639.02751: done checking for max_fail_percentage 41175 1727204639.02752: checking to see if all hosts have failed and the running result is not ok 41175 1727204639.02754: done checking to see if all hosts have failed 41175 1727204639.02755: getting the remaining hosts for this loop 41175 1727204639.02757: done getting the remaining hosts for this loop 41175 1727204639.02762: getting the next task for host managed-node3 41175 1727204639.02771: done getting next task for host managed-node3 41175 1727204639.02777: ^ task is: TASK: Install iproute 41175 1727204639.02781: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204639.02786: getting variables 41175 1727204639.02789: in VariableManager get_vars() 41175 1727204639.02840: Calling all_inventory to load vars for managed-node3 41175 1727204639.02844: Calling groups_inventory to load vars for managed-node3 41175 1727204639.02847: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204639.02862: Calling all_plugins_play to load vars for managed-node3 41175 1727204639.02866: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204639.02870: Calling groups_plugins_play to load vars for managed-node3 41175 1727204639.03436: done sending task result for task 12b410aa-8751-f070-39c4-00000000036a 41175 1727204639.03440: WORKER PROCESS EXITING 41175 1727204639.03468: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204639.03829: done with get_vars() 41175 1727204639.03842: done getting variables 41175 1727204639.03920: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Tuesday 24 September 2024 15:03:59 -0400 (0:00:00.042) 0:00:06.178 ***** 41175 1727204639.03957: entering _queue_task() for managed-node3/package 41175 1727204639.04267: worker is 1 (out of 1 available) 41175 1727204639.04282: exiting _queue_task() for managed-node3/package 41175 1727204639.04415: done queuing things up, now waiting for results queue to drain 41175 1727204639.04420: waiting for pending results... 
41175 1727204639.04627: running TaskExecutor() for managed-node3/TASK: Install iproute 41175 1727204639.04771: in run() - task 12b410aa-8751-f070-39c4-00000000026d 41175 1727204639.04794: variable 'ansible_search_path' from source: unknown 41175 1727204639.04803: variable 'ansible_search_path' from source: unknown 41175 1727204639.04860: calling self._execute() 41175 1727204639.04971: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204639.04987: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204639.05061: variable 'omit' from source: magic vars 41175 1727204639.05514: variable 'ansible_distribution_major_version' from source: facts 41175 1727204639.05539: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204639.05553: variable 'omit' from source: magic vars 41175 1727204639.05615: variable 'omit' from source: magic vars 41175 1727204639.05901: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41175 1727204639.08760: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41175 1727204639.08848: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41175 1727204639.08969: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41175 1727204639.08972: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41175 1727204639.09003: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41175 1727204639.09199: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204639.09208: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204639.09234: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204639.09291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204639.09331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204639.09472: variable '__network_is_ostree' from source: set_fact 41175 1727204639.09484: variable 'omit' from source: magic vars 41175 1727204639.09635: variable 'omit' from source: magic vars 41175 1727204639.09638: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41175 1727204639.09643: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41175 1727204639.09646: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41175 1727204639.09673: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204639.09692: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204639.09736: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41175 1727204639.09766: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204639.09769: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node3' 41175 1727204639.09914: Set connection var ansible_shell_executable to /bin/sh 41175 1727204639.09962: Set connection var ansible_shell_type to sh 41175 1727204639.09965: Set connection var ansible_pipelining to False 41175 1727204639.09968: Set connection var ansible_timeout to 10 41175 1727204639.09971: Set connection var ansible_connection to ssh 41175 1727204639.09987: Set connection var ansible_module_compression to ZIP_DEFLATED 41175 1727204639.10025: variable 'ansible_shell_executable' from source: unknown 41175 1727204639.10035: variable 'ansible_connection' from source: unknown 41175 1727204639.10071: variable 'ansible_module_compression' from source: unknown 41175 1727204639.10074: variable 'ansible_shell_type' from source: unknown 41175 1727204639.10077: variable 'ansible_shell_executable' from source: unknown 41175 1727204639.10079: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204639.10090: variable 'ansible_pipelining' from source: unknown 41175 1727204639.10095: variable 'ansible_timeout' from source: unknown 41175 1727204639.10101: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204639.10291: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41175 1727204639.10301: variable 'omit' from source: magic vars 41175 1727204639.10303: starting attempt loop 41175 1727204639.10311: running the handler 41175 1727204639.10313: variable 'ansible_facts' from source: unknown 41175 1727204639.10315: variable 'ansible_facts' from source: unknown 41175 1727204639.10357: _low_level_execute_command(): starting 41175 1727204639.10371: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41175 
1727204639.11106: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204639.11139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204639.11142: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 41175 1727204639.11145: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204639.11147: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204639.11207: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204639.11224: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204639.11234: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204639.11267: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204639.13707: stdout chunk (state=3): >>>/root <<< 41175 1727204639.13875: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204639.13934: stderr chunk (state=3): >>><<< 41175 1727204639.13938: stdout chunk (state=3): >>><<< 41175 1727204639.13963: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204639.13974: _low_level_execute_command(): starting 41175 1727204639.13982: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204639.1396322-41443-173311626981750 `" && echo ansible-tmp-1727204639.1396322-41443-173311626981750="` echo /root/.ansible/tmp/ansible-tmp-1727204639.1396322-41443-173311626981750 `" ) && sleep 0' 41175 1727204639.14430: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204639.14433: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 41175 1727204639.14436: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 41175 1727204639.14439: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204639.14484: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204639.14502: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204639.14551: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204639.17471: stdout chunk (state=3): >>>ansible-tmp-1727204639.1396322-41443-173311626981750=/root/.ansible/tmp/ansible-tmp-1727204639.1396322-41443-173311626981750 <<< 41175 1727204639.17658: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204639.17714: stderr chunk (state=3): >>><<< 41175 1727204639.17719: stdout chunk (state=3): >>><<< 41175 1727204639.17796: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204639.1396322-41443-173311626981750=/root/.ansible/tmp/ansible-tmp-1727204639.1396322-41443-173311626981750 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204639.17800: variable 'ansible_module_compression' from source: unknown 41175 1727204639.17819: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 41175 1727204639.17823: ANSIBALLZ: Acquiring lock 41175 1727204639.17828: ANSIBALLZ: Lock acquired: 140088839296144 41175 1727204639.17833: ANSIBALLZ: Creating module 41175 1727204639.34879: ANSIBALLZ: Writing module into payload 41175 1727204639.35075: ANSIBALLZ: Writing module 41175 1727204639.35098: ANSIBALLZ: Renaming module 41175 1727204639.35103: ANSIBALLZ: Done creating module 41175 1727204639.35126: variable 'ansible_facts' from source: unknown 41175 1727204639.35186: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204639.1396322-41443-173311626981750/AnsiballZ_dnf.py 41175 1727204639.35308: Sending initial data 41175 1727204639.35311: Sent initial data (152 bytes) 41175 1727204639.35760: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204639.35814: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204639.35821: stderr chunk (state=3): >>>debug2: checking match for 'final all' 
host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204639.35824: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204639.35826: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204639.35867: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204639.35870: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204639.35874: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204639.35926: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204639.38299: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41175 
1727204639.38347: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 41175 1727204639.38386: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41175qa2aqh90/tmpuinan6ob /root/.ansible/tmp/ansible-tmp-1727204639.1396322-41443-173311626981750/AnsiballZ_dnf.py <<< 41175 1727204639.38398: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204639.1396322-41443-173311626981750/AnsiballZ_dnf.py" <<< 41175 1727204639.38430: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-41175qa2aqh90/tmpuinan6ob" to remote "/root/.ansible/tmp/ansible-tmp-1727204639.1396322-41443-173311626981750/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204639.1396322-41443-173311626981750/AnsiballZ_dnf.py" <<< 41175 1727204639.39484: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204639.39547: stderr chunk (state=3): >>><<< 41175 1727204639.39551: stdout chunk (state=3): >>><<< 41175 1727204639.39570: done transferring module to remote 41175 1727204639.39583: _low_level_execute_command(): starting 41175 1727204639.39588: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204639.1396322-41443-173311626981750/ /root/.ansible/tmp/ansible-tmp-1727204639.1396322-41443-173311626981750/AnsiballZ_dnf.py && sleep 0' 41175 1727204639.40068: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 1727204639.40071: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204639.40074: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204639.40077: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204639.40079: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204639.40128: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204639.40131: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204639.40180: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204639.42799: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204639.42845: stderr chunk (state=3): >>><<< 41175 1727204639.42849: stdout chunk (state=3): >>><<< 41175 1727204639.42864: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204639.42867: _low_level_execute_command(): starting 41175 1727204639.42875: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204639.1396322-41443-173311626981750/AnsiballZ_dnf.py && sleep 0' 41175 1727204639.43337: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204639.43340: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204639.43343: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204639.43345: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204639.43385: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204639.43395: stderr chunk 
(state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204639.43459: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204641.56540: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 41175 1727204641.62163: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
<<< 41175 1727204641.62223: stderr chunk (state=3): >>><<< 41175 1727204641.62227: stdout chunk (state=3): >>><<< 41175 1727204641.62243: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 
4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 41175 1727204641.62297: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204639.1396322-41443-173311626981750/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41175 1727204641.62304: _low_level_execute_command(): starting 41175 1727204641.62310: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204639.1396322-41443-173311626981750/ > /dev/null 2>&1 && sleep 0' 41175 1727204641.62795: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204641.62799: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204641.62802: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 41175 1727204641.62817: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 41175 1727204641.62840: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204641.62887: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204641.62897: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204641.62899: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204641.62938: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204641.65096: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204641.65100: stdout chunk (state=3): >>><<< 41175 1727204641.65102: stderr chunk (state=3): >>><<< 41175 1727204641.65105: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204641.65107: handler run complete 41175 1727204641.65195: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41175 1727204641.65405: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41175 1727204641.65454: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41175 1727204641.65495: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41175 1727204641.65530: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41175 1727204641.65616: variable '__install_status' from source: unknown 41175 1727204641.65644: Evaluated conditional (__install_status is success): True 41175 1727204641.65670: attempt loop complete, returning result 41175 1727204641.65677: _execute() done 41175 1727204641.65684: dumping result to json 41175 1727204641.65696: done dumping result, returning 41175 1727204641.65708: done running TaskExecutor() for managed-node3/TASK: Install iproute [12b410aa-8751-f070-39c4-00000000026d] 41175 1727204641.65716: sending task result for task 12b410aa-8751-f070-39c4-00000000026d ok: [managed-node3] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 41175 1727204641.65941: no more pending results, returning what we have 41175 1727204641.65945: results queue empty 41175 1727204641.65946: checking for any_errors_fatal 41175 1727204641.65954: done checking for any_errors_fatal 41175 1727204641.65955: checking for max_fail_percentage 41175 1727204641.65956: done checking for max_fail_percentage 41175 1727204641.65957: checking to see if all hosts have failed and the 
running result is not ok 41175 1727204641.65959: done checking to see if all hosts have failed 41175 1727204641.65960: getting the remaining hosts for this loop 41175 1727204641.65962: done getting the remaining hosts for this loop 41175 1727204641.65966: getting the next task for host managed-node3 41175 1727204641.65974: done getting next task for host managed-node3 41175 1727204641.65977: ^ task is: TASK: Create veth interface {{ interface }} 41175 1727204641.65980: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204641.65984: getting variables 41175 1727204641.65986: in VariableManager get_vars() 41175 1727204641.66343: Calling all_inventory to load vars for managed-node3 41175 1727204641.66347: Calling groups_inventory to load vars for managed-node3 41175 1727204641.66350: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204641.66358: done sending task result for task 12b410aa-8751-f070-39c4-00000000026d 41175 1727204641.66362: WORKER PROCESS EXITING 41175 1727204641.66374: Calling all_plugins_play to load vars for managed-node3 41175 1727204641.66377: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204641.66381: Calling groups_plugins_play to load vars for managed-node3 41175 1727204641.66598: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204641.66783: done with get_vars() 41175 1727204641.66795: done getting variables 41175 1727204641.66843: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 41175 1727204641.66951: variable 'interface' from source: set_fact TASK [Create veth interface ethtest0] ****************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Tuesday 24 September 2024 15:04:01 -0400 (0:00:02.630) 0:00:08.808 ***** 41175 1727204641.66978: entering _queue_task() for managed-node3/command 41175 1727204641.67213: worker is 1 (out of 1 available) 41175 1727204641.67228: exiting _queue_task() for managed-node3/command 41175 1727204641.67240: done queuing things up, now waiting for results queue to drain 41175 1727204641.67242: waiting for pending results... 
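The `debug1: auto-mux: Trying existing master` and `mux_client_request_session: master session id: 2` lines that repeat throughout this log come from OpenSSH connection multiplexing: Ansible's ssh connection plugin asks for a shared master connection, so each `_low_level_execute_command()` call reuses one established SSH session instead of performing a fresh handshake. A minimal client-side configuration that produces the same multiplexing behavior looks like the fragment below (the `ControlPath` value is illustrative; Ansible normally passes its own equivalent `-o` options on the ssh command line):

```
# ~/.ssh/config -- illustrative sketch; Ansible sets equivalent -o options itself
Host 10.31.10.90
    ControlMaster auto              # reuse an existing master, or create one
    ControlPath ~/.ssh/cm-%r@%h:%p  # socket path for the shared connection
    ControlPersist 60s              # keep the master alive after the last client
```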
41175 1727204641.67415: running TaskExecutor() for managed-node3/TASK: Create veth interface ethtest0 41175 1727204641.67494: in run() - task 12b410aa-8751-f070-39c4-00000000026e 41175 1727204641.67506: variable 'ansible_search_path' from source: unknown 41175 1727204641.67511: variable 'ansible_search_path' from source: unknown 41175 1727204641.67750: variable 'interface' from source: set_fact 41175 1727204641.67826: variable 'interface' from source: set_fact 41175 1727204641.67885: variable 'interface' from source: set_fact 41175 1727204641.68019: Loaded config def from plugin (lookup/items) 41175 1727204641.68028: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 41175 1727204641.68049: variable 'omit' from source: magic vars 41175 1727204641.68159: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204641.68168: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204641.68179: variable 'omit' from source: magic vars 41175 1727204641.68381: variable 'ansible_distribution_major_version' from source: facts 41175 1727204641.68388: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204641.68567: variable 'type' from source: set_fact 41175 1727204641.68570: variable 'state' from source: include params 41175 1727204641.68577: variable 'interface' from source: set_fact 41175 1727204641.68580: variable 'current_interfaces' from source: set_fact 41175 1727204641.68588: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 41175 1727204641.68596: variable 'omit' from source: magic vars 41175 1727204641.68630: variable 'omit' from source: magic vars 41175 1727204641.68667: variable 'item' from source: unknown 41175 1727204641.68730: variable 'item' from source: unknown 41175 1727204641.68746: variable 'omit' from source: magic vars 41175 1727204641.68774: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41175 1727204641.68805: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41175 1727204641.68825: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41175 1727204641.68842: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204641.68852: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204641.68878: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41175 1727204641.68881: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204641.68888: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204641.68980: Set connection var ansible_shell_executable to /bin/sh 41175 1727204641.68984: Set connection var ansible_shell_type to sh 41175 1727204641.68991: Set connection var ansible_pipelining to False 41175 1727204641.69001: Set connection var ansible_timeout to 10 41175 1727204641.69012: Set connection var ansible_connection to ssh 41175 1727204641.69015: Set connection var ansible_module_compression to ZIP_DEFLATED 41175 1727204641.69035: variable 'ansible_shell_executable' from source: unknown 41175 1727204641.69038: variable 'ansible_connection' from source: unknown 41175 1727204641.69041: variable 'ansible_module_compression' from source: unknown 41175 1727204641.69044: variable 'ansible_shell_type' from source: unknown 41175 1727204641.69048: variable 'ansible_shell_executable' from source: unknown 41175 1727204641.69052: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204641.69058: variable 'ansible_pipelining' from source: unknown 41175 1727204641.69062: variable 'ansible_timeout' from 
source: unknown 41175 1727204641.69067: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204641.69186: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41175 1727204641.69196: variable 'omit' from source: magic vars 41175 1727204641.69202: starting attempt loop 41175 1727204641.69205: running the handler 41175 1727204641.69222: _low_level_execute_command(): starting 41175 1727204641.69233: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41175 1727204641.69776: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204641.69780: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204641.69786: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204641.69789: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204641.69841: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 
setting O_NONBLOCK <<< 41175 1727204641.69844: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204641.69901: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204641.71707: stdout chunk (state=3): >>>/root <<< 41175 1727204641.71811: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204641.71873: stderr chunk (state=3): >>><<< 41175 1727204641.71877: stdout chunk (state=3): >>><<< 41175 1727204641.71903: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204641.71915: _low_level_execute_command(): starting 41175 1727204641.71921: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1727204641.7190297-41594-112375379471732 `" && echo ansible-tmp-1727204641.7190297-41594-112375379471732="` echo /root/.ansible/tmp/ansible-tmp-1727204641.7190297-41594-112375379471732 `" ) && sleep 0' 41175 1727204641.72411: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204641.72415: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204641.72418: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204641.72420: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204641.72475: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204641.72478: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204641.72527: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204641.74587: stdout chunk (state=3): >>>ansible-tmp-1727204641.7190297-41594-112375379471732=/root/.ansible/tmp/ansible-tmp-1727204641.7190297-41594-112375379471732 <<< 41175 1727204641.74708: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 
1727204641.74764: stderr chunk (state=3): >>><<< 41175 1727204641.74767: stdout chunk (state=3): >>><<< 41175 1727204641.74784: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204641.7190297-41594-112375379471732=/root/.ansible/tmp/ansible-tmp-1727204641.7190297-41594-112375379471732 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204641.74823: variable 'ansible_module_compression' from source: unknown 41175 1727204641.74862: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-41175qa2aqh90/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 41175 1727204641.74896: variable 'ansible_facts' from source: unknown 41175 1727204641.74965: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204641.7190297-41594-112375379471732/AnsiballZ_command.py 41175 1727204641.75085: Sending initial data 41175 1727204641.75091: Sent initial data (156 bytes) 
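The `umask 77 && mkdir -p ... && mkdir ... && echo ...` command above is how Ansible provisions a private per-task working directory on the remote host before transferring `AnsiballZ_command.py` into it. A sketch of the same pattern, with the naming scheme (`ansible-tmp-<unix time>-<pid>-<random>`) inferred from the directory name in the log rather than quoted from Ansible's source:

```python
import os
import random
import time

def make_ansible_style_tmpdir(root="~/.ansible/tmp"):
    """Create a private tmpdir the way the logged shell command does.

    Naming is inferred from the log (ansible-tmp-<time>-<pid>-<random>);
    umask 0o077 mirrors the `umask 77` in the remote command, so the
    directory comes out mode 0700 and readable only by the remote user.
    """
    root = os.path.expanduser(root)
    name = "ansible-tmp-%s-%s-%s" % (time.time(), os.getpid(),
                                     random.randint(0, 2 ** 48))
    path = os.path.join(root, name)
    old_umask = os.umask(0o077)
    try:
        # Like `mkdir -p` for the root plus a strict `mkdir` for the leaf:
        # makedirs tolerates existing parents but fails if the leaf exists.
        os.makedirs(path)
    finally:
        os.umask(old_umask)
    return path

d = make_ansible_style_tmpdir()
print(d)
```

The `&& echo ansible-tmp-...="..."` tail of the real command exists so the controller can read the created path back out of stdout, which is exactly the `rc=0, stdout=ansible-tmp-...` result visible a few records later.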
41175 1727204641.75571: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204641.75575: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 41175 1727204641.75577: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 41175 1727204641.75580: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 41175 1727204641.75582: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204641.75635: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204641.75638: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204641.75693: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204641.77390: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension 
"lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41175 1727204641.77423: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 41175 1727204641.77458: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41175qa2aqh90/tmpbt6jfi21 /root/.ansible/tmp/ansible-tmp-1727204641.7190297-41594-112375379471732/AnsiballZ_command.py <<< 41175 1727204641.77462: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204641.7190297-41594-112375379471732/AnsiballZ_command.py" <<< 41175 1727204641.77492: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-41175qa2aqh90/tmpbt6jfi21" to remote "/root/.ansible/tmp/ansible-tmp-1727204641.7190297-41594-112375379471732/AnsiballZ_command.py" <<< 41175 1727204641.77499: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204641.7190297-41594-112375379471732/AnsiballZ_command.py" <<< 41175 1727204641.78263: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204641.78341: stderr chunk (state=3): >>><<< 41175 1727204641.78345: stdout chunk (state=3): >>><<< 41175 1727204641.78367: done transferring module to remote 41175 1727204641.78378: _low_level_execute_command(): starting 41175 1727204641.78384: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204641.7190297-41594-112375379471732/ /root/.ansible/tmp/ansible-tmp-1727204641.7190297-41594-112375379471732/AnsiballZ_command.py && sleep 0' 41175 1727204641.78867: stderr chunk (state=2): >>>OpenSSH_9.3p1, 
OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204641.78871: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204641.78877: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204641.78881: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204641.78936: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204641.78941: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204641.78979: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204641.80925: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204641.80983: stderr chunk (state=3): >>><<< 41175 1727204641.80986: stdout chunk (state=3): >>><<< 41175 1727204641.81005: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204641.81008: _low_level_execute_command(): starting 41175 1727204641.81014: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204641.7190297-41594-112375379471732/AnsiballZ_command.py && sleep 0' 41175 1727204641.81525: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204641.81577: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204641.81581: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204641.81636: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204642.00140: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "ethtest0", "type", "veth", "peer", "name", "peerethtest0"], "start": "2024-09-24 15:04:01.994201", "end": "2024-09-24 15:04:01.999700", "delta": "0:00:00.005499", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add ethtest0 type veth peer name peerethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 41175 1727204642.03278: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
<<< 41175 1727204642.03345: stderr chunk (state=3): >>><<< 41175 1727204642.03349: stdout chunk (state=3): >>><<< 41175 1727204642.03365: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "ethtest0", "type", "veth", "peer", "name", "peerethtest0"], "start": "2024-09-24 15:04:01.994201", "end": "2024-09-24 15:04:01.999700", "delta": "0:00:00.005499", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add ethtest0 type veth peer name peerethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
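The module's stdout above is a single JSON document; TaskExecutor parses it to build the `ok: [managed-node3] => {...}` result shown later (rc, cmd, delta, start/end). A sketch that parses the same payload, trimmed to the fields the log reports (the `invocation` block is omitted here for brevity):

```python
import json

# The command module's stdout for the veth task, copied from the log above
# (whitespace reflowed; `invocation` omitted).
raw = '''{"changed": true, "stdout": "", "stderr": "", "rc": 0,
"cmd": ["ip", "link", "add", "ethtest0", "type", "veth",
        "peer", "name", "peerethtest0"],
"start": "2024-09-24 15:04:01.994201", "end": "2024-09-24 15:04:01.999700",
"delta": "0:00:00.005499", "msg": ""}'''

result = json.loads(raw)
# rc and the argv the module actually executed on the managed node:
print(result["rc"], " ".join(result["cmd"]))
# -> 0 ip link add ethtest0 type veth peer name peerethtest0
```

Note that the module reports `"changed": true` here, while the final task result prints `"changed": false` — the command action's handler re-evaluates changed status on the controller side after the fact.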
41175 1727204642.03409: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link add ethtest0 type veth peer name peerethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204641.7190297-41594-112375379471732/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41175 1727204642.03418: _low_level_execute_command(): starting 41175 1727204642.03426: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204641.7190297-41594-112375379471732/ > /dev/null 2>&1 && sleep 0' 41175 1727204642.03895: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204642.03920: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 41175 1727204642.03929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204642.03988: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204642.03995: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204642.03998: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204642.04036: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204642.08663: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204642.08733: stderr chunk (state=3): >>><<< 41175 1727204642.08736: stdout chunk (state=3): >>><<< 41175 1727204642.08752: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 
1727204642.08760: handler run complete 41175 1727204642.08782: Evaluated conditional (False): False 41175 1727204642.08798: attempt loop complete, returning result 41175 1727204642.08816: variable 'item' from source: unknown 41175 1727204642.08888: variable 'item' from source: unknown ok: [managed-node3] => (item=ip link add ethtest0 type veth peer name peerethtest0) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "add", "ethtest0", "type", "veth", "peer", "name", "peerethtest0" ], "delta": "0:00:00.005499", "end": "2024-09-24 15:04:01.999700", "item": "ip link add ethtest0 type veth peer name peerethtest0", "rc": 0, "start": "2024-09-24 15:04:01.994201" } 41175 1727204642.09081: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204642.09084: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204642.09087: variable 'omit' from source: magic vars 41175 1727204642.09234: variable 'ansible_distribution_major_version' from source: facts 41175 1727204642.09240: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204642.09404: variable 'type' from source: set_fact 41175 1727204642.09407: variable 'state' from source: include params 41175 1727204642.09413: variable 'interface' from source: set_fact 41175 1727204642.09422: variable 'current_interfaces' from source: set_fact 41175 1727204642.09425: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 41175 1727204642.09437: variable 'omit' from source: magic vars 41175 1727204642.09447: variable 'omit' from source: magic vars 41175 1727204642.09480: variable 'item' from source: unknown 41175 1727204642.09536: variable 'item' from source: unknown 41175 1727204642.09658: variable 'omit' from source: magic vars 41175 1727204642.09662: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py 
(found_in_cache=True, class_only=False) 41175 1727204642.09665: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204642.09667: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204642.09670: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41175 1727204642.09672: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204642.09674: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204642.09676: Set connection var ansible_shell_executable to /bin/sh 41175 1727204642.09678: Set connection var ansible_shell_type to sh 41175 1727204642.09681: Set connection var ansible_pipelining to False 41175 1727204642.09683: Set connection var ansible_timeout to 10 41175 1727204642.09692: Set connection var ansible_connection to ssh 41175 1727204642.09699: Set connection var ansible_module_compression to ZIP_DEFLATED 41175 1727204642.09716: variable 'ansible_shell_executable' from source: unknown 41175 1727204642.09721: variable 'ansible_connection' from source: unknown 41175 1727204642.09724: variable 'ansible_module_compression' from source: unknown 41175 1727204642.09726: variable 'ansible_shell_type' from source: unknown 41175 1727204642.09729: variable 'ansible_shell_executable' from source: unknown 41175 1727204642.09734: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204642.09739: variable 'ansible_pipelining' from source: unknown 41175 1727204642.09743: variable 'ansible_timeout' from source: unknown 41175 1727204642.09748: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204642.09834: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41175 1727204642.09843: variable 'omit' from source: magic vars 41175 1727204642.09848: starting attempt loop 41175 1727204642.09851: running the handler 41175 1727204642.09860: _low_level_execute_command(): starting 41175 1727204642.09870: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41175 1727204642.10381: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204642.10385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204642.10387: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 41175 1727204642.10392: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 41175 1727204642.10395: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204642.10461: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204642.10464: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204642.10466: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204642.10502: stderr 
chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204642.12237: stdout chunk (state=3): >>>/root <<< 41175 1727204642.12334: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204642.12394: stderr chunk (state=3): >>><<< 41175 1727204642.12398: stdout chunk (state=3): >>><<< 41175 1727204642.12416: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204642.12428: _low_level_execute_command(): starting 41175 1727204642.12434: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204642.124193-41594-211739239464462 `" && echo ansible-tmp-1727204642.124193-41594-211739239464462="` echo /root/.ansible/tmp/ansible-tmp-1727204642.124193-41594-211739239464462 `" ) && 
sleep 0' 41175 1727204642.12932: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204642.12936: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 41175 1727204642.12938: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address <<< 41175 1727204642.12941: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204642.12943: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204642.12994: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204642.12998: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204642.13045: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204642.15085: stdout chunk (state=3): >>>ansible-tmp-1727204642.124193-41594-211739239464462=/root/.ansible/tmp/ansible-tmp-1727204642.124193-41594-211739239464462 <<< 41175 1727204642.15193: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204642.15251: stderr chunk (state=3): >>><<< 41175 1727204642.15254: stdout chunk (state=3): >>><<< 41175 1727204642.15269: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204642.124193-41594-211739239464462=/root/.ansible/tmp/ansible-tmp-1727204642.124193-41594-211739239464462 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204642.15296: variable 'ansible_module_compression' from source: unknown 41175 1727204642.15334: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-41175qa2aqh90/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 41175 1727204642.15351: variable 'ansible_facts' from source: unknown 41175 1727204642.15399: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204642.124193-41594-211739239464462/AnsiballZ_command.py 41175 1727204642.15511: Sending initial data 41175 1727204642.15515: Sent initial data (155 bytes) 41175 1727204642.16007: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204642.16010: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204642.16013: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 41175 1727204642.16016: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 41175 1727204642.16020: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204642.16074: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204642.16082: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204642.16087: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204642.16119: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204642.17819: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension 
"expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41175 1727204642.17911: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 41175 1727204642.17915: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41175qa2aqh90/tmp5b6upkh8 /root/.ansible/tmp/ansible-tmp-1727204642.124193-41594-211739239464462/AnsiballZ_command.py <<< 41175 1727204642.17918: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204642.124193-41594-211739239464462/AnsiballZ_command.py" <<< 41175 1727204642.17958: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-41175qa2aqh90/tmp5b6upkh8" to remote "/root/.ansible/tmp/ansible-tmp-1727204642.124193-41594-211739239464462/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204642.124193-41594-211739239464462/AnsiballZ_command.py" <<< 41175 1727204642.19026: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204642.19046: stderr chunk (state=3): >>><<< 41175 1727204642.19050: stdout chunk (state=3): >>><<< 41175 1727204642.19099: done transferring module to remote 41175 1727204642.19102: _low_level_execute_command(): starting 41175 1727204642.19105: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204642.124193-41594-211739239464462/ /root/.ansible/tmp/ansible-tmp-1727204642.124193-41594-211739239464462/AnsiballZ_command.py && sleep 0' 41175 1727204642.19760: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 1727204642.19778: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204642.19901: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204642.19917: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204642.19993: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204642.21900: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204642.21960: stderr chunk (state=3): >>><<< 41175 1727204642.21964: stdout chunk (state=3): >>><<< 41175 1727204642.21986: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204642.21991: _low_level_execute_command(): starting 41175 1727204642.21997: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204642.124193-41594-211739239464462/AnsiballZ_command.py && sleep 0' 41175 1727204642.22496: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204642.22500: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204642.22502: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204642.22505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204642.22557: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204642.22561: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204642.22610: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204642.40457: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerethtest0", "up"], "start": "2024-09-24 15:04:02.399435", "end": "2024-09-24 15:04:02.403416", "delta": "0:00:00.003981", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 41175 1727204642.42264: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 41175 1727204642.42287: stderr chunk (state=3): >>><<< 41175 1727204642.42305: stdout chunk (state=3): >>><<< 41175 1727204642.42352: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerethtest0", "up"], "start": "2024-09-24 15:04:02.399435", "end": "2024-09-24 15:04:02.403416", "delta": "0:00:00.003981", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 41175 1727204642.42412: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set peerethtest0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204642.124193-41594-211739239464462/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41175 1727204642.42432: _low_level_execute_command(): starting 41175 1727204642.42458: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204642.124193-41594-211739239464462/ > /dev/null 2>&1 && sleep 0' 41175 1727204642.43194: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 
41175 1727204642.43225: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204642.43328: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204642.43386: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204642.43421: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204642.43475: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204642.43524: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204642.45452: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204642.45506: stderr chunk (state=3): >>><<< 41175 1727204642.45509: stdout chunk (state=3): >>><<< 41175 1727204642.45527: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204642.45533: handler run complete 41175 1727204642.45558: Evaluated conditional (False): False 41175 1727204642.45571: attempt loop complete, returning result 41175 1727204642.45590: variable 'item' from source: unknown 41175 1727204642.45675: variable 'item' from source: unknown ok: [managed-node3] => (item=ip link set peerethtest0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "peerethtest0", "up" ], "delta": "0:00:00.003981", "end": "2024-09-24 15:04:02.403416", "item": "ip link set peerethtest0 up", "rc": 0, "start": "2024-09-24 15:04:02.399435" } 41175 1727204642.45823: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204642.45827: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204642.45829: variable 'omit' from source: magic vars 41175 1727204642.45968: variable 'ansible_distribution_major_version' from source: facts 41175 1727204642.45974: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204642.46146: variable 'type' from source: set_fact 41175 1727204642.46149: variable 'state' from source: include params 41175 
1727204642.46155: variable 'interface' from source: set_fact 41175 1727204642.46160: variable 'current_interfaces' from source: set_fact 41175 1727204642.46170: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 41175 1727204642.46173: variable 'omit' from source: magic vars 41175 1727204642.46188: variable 'omit' from source: magic vars 41175 1727204642.46224: variable 'item' from source: unknown 41175 1727204642.46281: variable 'item' from source: unknown 41175 1727204642.46302: variable 'omit' from source: magic vars 41175 1727204642.46391: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41175 1727204642.46397: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204642.46400: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204642.46403: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41175 1727204642.46405: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204642.46409: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204642.46421: Set connection var ansible_shell_executable to /bin/sh 41175 1727204642.46424: Set connection var ansible_shell_type to sh 41175 1727204642.46429: Set connection var ansible_pipelining to False 41175 1727204642.46438: Set connection var ansible_timeout to 10 41175 1727204642.46448: Set connection var ansible_connection to ssh 41175 1727204642.46455: Set connection var ansible_module_compression to ZIP_DEFLATED 41175 1727204642.46482: variable 'ansible_shell_executable' from source: unknown 41175 1727204642.46485: variable 'ansible_connection' from source: unknown 41175 1727204642.46488: variable 
'ansible_module_compression' from source: unknown 41175 1727204642.46494: variable 'ansible_shell_type' from source: unknown 41175 1727204642.46497: variable 'ansible_shell_executable' from source: unknown 41175 1727204642.46502: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204642.46507: variable 'ansible_pipelining' from source: unknown 41175 1727204642.46510: variable 'ansible_timeout' from source: unknown 41175 1727204642.46516: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204642.46595: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41175 1727204642.46603: variable 'omit' from source: magic vars 41175 1727204642.46612: starting attempt loop 41175 1727204642.46615: running the handler 41175 1727204642.46620: _low_level_execute_command(): starting 41175 1727204642.46623: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41175 1727204642.47151: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204642.47155: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204642.47157: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204642.47160: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204642.47162: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204642.47226: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204642.47233: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204642.47236: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204642.47269: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204642.48962: stdout chunk (state=3): >>>/root <<< 41175 1727204642.49188: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204642.49195: stdout chunk (state=3): >>><<< 41175 1727204642.49197: stderr chunk (state=3): >>><<< 41175 1727204642.49305: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204642.49309: _low_level_execute_command(): starting 41175 1727204642.49312: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204642.492178-41594-24408325628126 `" && echo ansible-tmp-1727204642.492178-41594-24408325628126="` echo /root/.ansible/tmp/ansible-tmp-1727204642.492178-41594-24408325628126 `" ) && sleep 0' 41175 1727204642.49944: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 1727204642.49973: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204642.50078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204642.50114: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204642.50136: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204642.50162: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204642.50238: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204642.52226: stdout chunk (state=3): >>>ansible-tmp-1727204642.492178-41594-24408325628126=/root/.ansible/tmp/ansible-tmp-1727204642.492178-41594-24408325628126 <<< 41175 1727204642.52428: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204642.52456: stdout chunk (state=3): >>><<< 41175 1727204642.52459: stderr chunk (state=3): >>><<< 41175 1727204642.52497: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204642.492178-41594-24408325628126=/root/.ansible/tmp/ansible-tmp-1727204642.492178-41594-24408325628126 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204642.52516: variable 'ansible_module_compression' from source: unknown 41175 
1727204642.52696: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-41175qa2aqh90/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 41175 1727204642.52699: variable 'ansible_facts' from source: unknown 41175 1727204642.52702: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204642.492178-41594-24408325628126/AnsiballZ_command.py 41175 1727204642.52794: Sending initial data 41175 1727204642.52806: Sent initial data (154 bytes) 41175 1727204642.53251: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204642.53266: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 41175 1727204642.53281: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204642.53333: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204642.53346: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204642.53392: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204642.55046: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension 
"posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41175 1727204642.55078: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 41175 1727204642.55115: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41175qa2aqh90/tmpqbty7xz9 /root/.ansible/tmp/ansible-tmp-1727204642.492178-41594-24408325628126/AnsiballZ_command.py <<< 41175 1727204642.55120: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204642.492178-41594-24408325628126/AnsiballZ_command.py" <<< 41175 1727204642.55146: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-41175qa2aqh90/tmpqbty7xz9" to remote "/root/.ansible/tmp/ansible-tmp-1727204642.492178-41594-24408325628126/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204642.492178-41594-24408325628126/AnsiballZ_command.py" <<< 41175 1727204642.55904: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204642.55981: stderr chunk (state=3): >>><<< 41175 1727204642.55985: stdout chunk (state=3): >>><<< 41175 1727204642.56008: done transferring module to remote 41175 1727204642.56017: _low_level_execute_command(): starting 41175 1727204642.56027: _low_level_execute_command(): 
executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204642.492178-41594-24408325628126/ /root/.ansible/tmp/ansible-tmp-1727204642.492178-41594-24408325628126/AnsiballZ_command.py && sleep 0' 41175 1727204642.56510: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204642.56514: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 41175 1727204642.56519: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 41175 1727204642.56521: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204642.56524: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204642.56577: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204642.56582: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204642.56616: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204642.58474: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204642.58543: stderr chunk (state=3): >>><<< 41175 1727204642.58547: stdout chunk (state=3): >>><<< 41175 1727204642.58562: _low_level_execute_command() 
done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204642.58565: _low_level_execute_command(): starting 41175 1727204642.58571: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204642.492178-41594-24408325628126/AnsiballZ_command.py && sleep 0' 41175 1727204642.59063: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204642.59068: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204642.59071: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204642.59074: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204642.59131: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204642.59136: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204642.59181: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204642.77182: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "ethtest0", "up"], "start": "2024-09-24 15:04:02.765928", "end": "2024-09-24 15:04:02.770372", "delta": "0:00:00.004444", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set ethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 41175 1727204642.78862: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
<<< 41175 1727204642.78932: stderr chunk (state=3): >>><<< 41175 1727204642.78936: stdout chunk (state=3): >>><<< 41175 1727204642.78953: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "ethtest0", "up"], "start": "2024-09-24 15:04:02.765928", "end": "2024-09-24 15:04:02.770372", "delta": "0:00:00.004444", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set ethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
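The `_low_level_execute_command() done:` line above shows that the command module returns its entire result as a single JSON object on stdout. As an illustration only (this is not Ansible's own code), here is a minimal Python sketch that decodes such a payload, using the field values copied from the log; it also cross-checks the redundant `delta` field against `start`/`end`:

```python
import json
from datetime import datetime

# Module result as it appears on stdout in the log above, abridged to the
# fields of interest (the full payload also carries an "invocation" block).
payload = '''{"changed": true, "stdout": "", "stderr": "", "rc": 0,
 "cmd": ["ip", "link", "set", "ethtest0", "up"],
 "start": "2024-09-24 15:04:02.765928", "end": "2024-09-24 15:04:02.770372",
 "delta": "0:00:00.004444", "msg": ""}'''

result = json.loads(payload)

# rc == 0 and an empty msg mean the remote command succeeded.
assert result["rc"] == 0 and result["msg"] == ""

# "delta" duplicates information in "start"/"end"; recompute it to verify.
elapsed = (datetime.fromisoformat(result["end"])
           - datetime.fromisoformat(result["start"]))
print(result["cmd"], elapsed.total_seconds())
```

The same decode happens inside Ansible when it turns this stdout chunk into the `ok: [managed-node3] => ...` result shown further down in the log.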
41175 1727204642.78982: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set ethtest0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204642.492178-41594-24408325628126/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41175 1727204642.78988: _low_level_execute_command(): starting 41175 1727204642.78996: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204642.492178-41594-24408325628126/ > /dev/null 2>&1 && sleep 0' 41175 1727204642.79464: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204642.79502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 41175 1727204642.79505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204642.79508: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204642.79510: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204642.79566: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204642.79572: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204642.79574: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204642.79616: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204642.81556: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204642.81612: stderr chunk (state=3): >>><<< 41175 1727204642.81615: stdout chunk (state=3): >>><<< 41175 1727204642.81632: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: 
master session id: 2 debug2: Received exit status from master 0 41175 1727204642.81638: handler run complete 41175 1727204642.81662: Evaluated conditional (False): False 41175 1727204642.81673: attempt loop complete, returning result 41175 1727204642.81692: variable 'item' from source: unknown 41175 1727204642.81762: variable 'item' from source: unknown ok: [managed-node3] => (item=ip link set ethtest0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "ethtest0", "up" ], "delta": "0:00:00.004444", "end": "2024-09-24 15:04:02.770372", "item": "ip link set ethtest0 up", "rc": 0, "start": "2024-09-24 15:04:02.765928" } 41175 1727204642.81899: dumping result to json 41175 1727204642.81903: done dumping result, returning 41175 1727204642.81905: done running TaskExecutor() for managed-node3/TASK: Create veth interface ethtest0 [12b410aa-8751-f070-39c4-00000000026e] 41175 1727204642.81907: sending task result for task 12b410aa-8751-f070-39c4-00000000026e 41175 1727204642.81962: done sending task result for task 12b410aa-8751-f070-39c4-00000000026e 41175 1727204642.81965: WORKER PROCESS EXITING 41175 1727204642.82057: no more pending results, returning what we have 41175 1727204642.82061: results queue empty 41175 1727204642.82063: checking for any_errors_fatal 41175 1727204642.82070: done checking for any_errors_fatal 41175 1727204642.82071: checking for max_fail_percentage 41175 1727204642.82073: done checking for max_fail_percentage 41175 1727204642.82076: checking to see if all hosts have failed and the running result is not ok 41175 1727204642.82077: done checking to see if all hosts have failed 41175 1727204642.82078: getting the remaining hosts for this loop 41175 1727204642.82080: done getting the remaining hosts for this loop 41175 1727204642.82084: getting the next task for host managed-node3 41175 1727204642.82092: done getting next task for host managed-node3 41175 1727204642.82095: ^ task is: TASK: Set up veth as managed by 
NetworkManager 41175 1727204642.82103: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41175 1727204642.82107: getting variables 41175 1727204642.82108: in VariableManager get_vars() 41175 1727204642.82150: Calling all_inventory to load vars for managed-node3 41175 1727204642.82153: Calling groups_inventory to load vars for managed-node3 41175 1727204642.82155: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204642.82167: Calling all_plugins_play to load vars for managed-node3 41175 1727204642.82170: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204642.82173: Calling groups_plugins_play to load vars for managed-node3 41175 1727204642.82376: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204642.82565: done with get_vars() 41175 1727204642.82575: done getting variables 41175 1727204642.82628: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Tuesday 24 September 
2024 15:04:02 -0400 (0:00:01.156) 0:00:09.965 ***** 41175 1727204642.82653: entering _queue_task() for managed-node3/command 41175 1727204642.82880: worker is 1 (out of 1 available) 41175 1727204642.82896: exiting _queue_task() for managed-node3/command 41175 1727204642.82908: done queuing things up, now waiting for results queue to drain 41175 1727204642.82910: waiting for pending results... 41175 1727204642.83077: running TaskExecutor() for managed-node3/TASK: Set up veth as managed by NetworkManager 41175 1727204642.83156: in run() - task 12b410aa-8751-f070-39c4-00000000026f 41175 1727204642.83168: variable 'ansible_search_path' from source: unknown 41175 1727204642.83172: variable 'ansible_search_path' from source: unknown 41175 1727204642.83207: calling self._execute() 41175 1727204642.83283: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204642.83291: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204642.83303: variable 'omit' from source: magic vars 41175 1727204642.83634: variable 'ansible_distribution_major_version' from source: facts 41175 1727204642.83645: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204642.83784: variable 'type' from source: set_fact 41175 1727204642.83789: variable 'state' from source: include params 41175 1727204642.83797: Evaluated conditional (type == 'veth' and state == 'present'): True 41175 1727204642.83807: variable 'omit' from source: magic vars 41175 1727204642.83839: variable 'omit' from source: magic vars 41175 1727204642.83926: variable 'interface' from source: set_fact 41175 1727204642.83942: variable 'omit' from source: magic vars 41175 1727204642.83979: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41175 1727204642.84016: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41175 1727204642.84035: 
trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41175 1727204642.84052: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204642.84063: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204642.84092: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41175 1727204642.84096: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204642.84099: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204642.84187: Set connection var ansible_shell_executable to /bin/sh 41175 1727204642.84192: Set connection var ansible_shell_type to sh 41175 1727204642.84198: Set connection var ansible_pipelining to False 41175 1727204642.84207: Set connection var ansible_timeout to 10 41175 1727204642.84214: Set connection var ansible_connection to ssh 41175 1727204642.84234: Set connection var ansible_module_compression to ZIP_DEFLATED 41175 1727204642.84248: variable 'ansible_shell_executable' from source: unknown 41175 1727204642.84252: variable 'ansible_connection' from source: unknown 41175 1727204642.84255: variable 'ansible_module_compression' from source: unknown 41175 1727204642.84257: variable 'ansible_shell_type' from source: unknown 41175 1727204642.84260: variable 'ansible_shell_executable' from source: unknown 41175 1727204642.84262: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204642.84265: variable 'ansible_pipelining' from source: unknown 41175 1727204642.84270: variable 'ansible_timeout' from source: unknown 41175 1727204642.84275: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204642.84400: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched 
paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41175 1727204642.84410: variable 'omit' from source: magic vars 41175 1727204642.84416: starting attempt loop 41175 1727204642.84422: running the handler 41175 1727204642.84435: _low_level_execute_command(): starting 41175 1727204642.84443: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41175 1727204642.85006: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204642.85010: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204642.85015: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204642.85018: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204642.85074: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204642.85078: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204642.85085: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204642.85124: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 41175 1727204642.86816: stdout chunk (state=3): >>>/root <<< 41175 1727204642.86923: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204642.86986: stderr chunk (state=3): >>><<< 41175 1727204642.86991: stdout chunk (state=3): >>><<< 41175 1727204642.87015: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204642.87031: _low_level_execute_command(): starting 41175 1727204642.87037: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204642.8701491-41785-140077298243840 `" && echo ansible-tmp-1727204642.8701491-41785-140077298243840="` echo /root/.ansible/tmp/ansible-tmp-1727204642.8701491-41785-140077298243840 `" ) && sleep 0' 41175 
1727204642.87526: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204642.87529: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 41175 1727204642.87540: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 41175 1727204642.87543: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204642.87592: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204642.87596: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204642.87640: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204642.89603: stdout chunk (state=3): >>>ansible-tmp-1727204642.8701491-41785-140077298243840=/root/.ansible/tmp/ansible-tmp-1727204642.8701491-41785-140077298243840 <<< 41175 1727204642.89717: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204642.89777: stderr chunk (state=3): >>><<< 41175 1727204642.89780: stdout chunk (state=3): >>><<< 41175 1727204642.89800: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204642.8701491-41785-140077298243840=/root/.ansible/tmp/ansible-tmp-1727204642.8701491-41785-140077298243840 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204642.89836: variable 'ansible_module_compression' from source: unknown 41175 1727204642.89878: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-41175qa2aqh90/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 41175 1727204642.89916: variable 'ansible_facts' from source: unknown 41175 1727204642.89981: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204642.8701491-41785-140077298243840/AnsiballZ_command.py 41175 1727204642.90102: Sending initial data 41175 1727204642.90106: Sent initial data (156 bytes) 41175 1727204642.90592: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 41175 
1727204642.90596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 41175 1727204642.90599: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204642.90602: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 41175 1727204642.90604: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204642.90660: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204642.90667: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204642.90702: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204642.92293: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 
debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41175 1727204642.92325: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 41175 1727204642.92362: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41175qa2aqh90/tmpb6bgqdpn /root/.ansible/tmp/ansible-tmp-1727204642.8701491-41785-140077298243840/AnsiballZ_command.py <<< 41175 1727204642.92370: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204642.8701491-41785-140077298243840/AnsiballZ_command.py" <<< 41175 1727204642.92395: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-41175qa2aqh90/tmpb6bgqdpn" to remote "/root/.ansible/tmp/ansible-tmp-1727204642.8701491-41785-140077298243840/AnsiballZ_command.py" <<< 41175 1727204642.92403: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204642.8701491-41785-140077298243840/AnsiballZ_command.py" <<< 41175 1727204642.93169: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204642.93247: stderr chunk (state=3): >>><<< 41175 1727204642.93250: stdout chunk (state=3): >>><<< 41175 1727204642.93274: done transferring module to remote 41175 1727204642.93285: _low_level_execute_command(): starting 41175 1727204642.93292: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204642.8701491-41785-140077298243840/ /root/.ansible/tmp/ansible-tmp-1727204642.8701491-41785-140077298243840/AnsiballZ_command.py && sleep 0' 41175 1727204642.93773: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204642.93777: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204642.93780: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204642.93782: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204642.93840: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204642.93843: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204642.93884: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204642.95710: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204642.95768: stderr chunk (state=3): >>><<< 41175 1727204642.95772: stdout chunk (state=3): >>><<< 41175 1727204642.95787: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204642.95792: _low_level_execute_command(): starting 41175 1727204642.95799: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204642.8701491-41785-140077298243840/AnsiballZ_command.py && sleep 0' 41175 1727204642.96289: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204642.96300: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204642.96303: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204642.96306: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204642.96354: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204642.96358: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204642.96411: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204643.15935: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "ethtest0", "managed", "true"], "start": "2024-09-24 15:04:03.137386", "end": "2024-09-24 15:04:03.158154", "delta": "0:00:00.020768", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set ethtest0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 41175 1727204643.17834: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
<<< 41175 1727204643.17838: stdout chunk (state=3): >>><<< 41175 1727204643.17840: stderr chunk (state=3): >>><<< 41175 1727204643.17996: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "ethtest0", "managed", "true"], "start": "2024-09-24 15:04:03.137386", "end": "2024-09-24 15:04:03.158154", "delta": "0:00:00.020768", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set ethtest0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
41175 1727204643.18000: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli d set ethtest0 managed true', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204642.8701491-41785-140077298243840/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41175 1727204643.18003: _low_level_execute_command(): starting 41175 1727204643.18005: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204642.8701491-41785-140077298243840/ > /dev/null 2>&1 && sleep 0' 41175 1727204643.18811: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204643.18819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204643.18907: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204643.18926: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204643.20984: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204643.20988: stdout chunk (state=3): >>><<< 41175 1727204643.20996: stderr chunk (state=3): >>><<< 41175 1727204643.21015: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204643.21195: handler run complete 41175 1727204643.21199: Evaluated conditional (False): False 41175 1727204643.21201: attempt loop complete, returning result 41175 1727204643.21204: 
_execute() done 41175 1727204643.21206: dumping result to json 41175 1727204643.21208: done dumping result, returning 41175 1727204643.21210: done running TaskExecutor() for managed-node3/TASK: Set up veth as managed by NetworkManager [12b410aa-8751-f070-39c4-00000000026f] 41175 1727204643.21212: sending task result for task 12b410aa-8751-f070-39c4-00000000026f 41175 1727204643.21295: done sending task result for task 12b410aa-8751-f070-39c4-00000000026f ok: [managed-node3] => { "changed": false, "cmd": [ "nmcli", "d", "set", "ethtest0", "managed", "true" ], "delta": "0:00:00.020768", "end": "2024-09-24 15:04:03.158154", "rc": 0, "start": "2024-09-24 15:04:03.137386" } 41175 1727204643.21381: no more pending results, returning what we have 41175 1727204643.21385: results queue empty 41175 1727204643.21386: checking for any_errors_fatal 41175 1727204643.21403: done checking for any_errors_fatal 41175 1727204643.21405: checking for max_fail_percentage 41175 1727204643.21407: done checking for max_fail_percentage 41175 1727204643.21408: checking to see if all hosts have failed and the running result is not ok 41175 1727204643.21410: done checking to see if all hosts have failed 41175 1727204643.21411: getting the remaining hosts for this loop 41175 1727204643.21412: done getting the remaining hosts for this loop 41175 1727204643.21418: getting the next task for host managed-node3 41175 1727204643.21425: done getting next task for host managed-node3 41175 1727204643.21428: ^ task is: TASK: Delete veth interface {{ interface }} 41175 1727204643.21432: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41175 1727204643.21436: getting variables 41175 1727204643.21438: in VariableManager get_vars() 41175 1727204643.21604: Calling all_inventory to load vars for managed-node3 41175 1727204643.21608: Calling groups_inventory to load vars for managed-node3 41175 1727204643.21612: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204643.21628: Calling all_plugins_play to load vars for managed-node3 41175 1727204643.21632: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204643.21636: Calling groups_plugins_play to load vars for managed-node3 41175 1727204643.22131: WORKER PROCESS EXITING 41175 1727204643.22159: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204643.22510: done with get_vars() 41175 1727204643.22527: done getting variables 41175 1727204643.22605: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 41175 1727204643.22750: variable 'interface' from source: set_fact TASK [Delete veth interface ethtest0] ****************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Tuesday 24 September 2024 15:04:03 -0400 (0:00:00.401) 0:00:10.366 ***** 41175 1727204643.22786: entering _queue_task() for managed-node3/command 41175 1727204643.23248: worker is 1 (out of 1 available) 41175 1727204643.23262: exiting _queue_task() for managed-node3/command 41175 1727204643.23273: done queuing things up, now waiting for results queue to drain 41175 
1727204643.23275: waiting for pending results... 41175 1727204643.23477: running TaskExecutor() for managed-node3/TASK: Delete veth interface ethtest0 41175 1727204643.23618: in run() - task 12b410aa-8751-f070-39c4-000000000270 41175 1727204643.23639: variable 'ansible_search_path' from source: unknown 41175 1727204643.23647: variable 'ansible_search_path' from source: unknown 41175 1727204643.23700: calling self._execute() 41175 1727204643.23819: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204643.23835: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204643.23852: variable 'omit' from source: magic vars 41175 1727204643.24313: variable 'ansible_distribution_major_version' from source: facts 41175 1727204643.24339: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204643.24642: variable 'type' from source: set_fact 41175 1727204643.24658: variable 'state' from source: include params 41175 1727204643.24669: variable 'interface' from source: set_fact 41175 1727204643.24678: variable 'current_interfaces' from source: set_fact 41175 1727204643.24695: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): False 41175 1727204643.24705: when evaluation is False, skipping this task 41175 1727204643.24713: _execute() done 41175 1727204643.24721: dumping result to json 41175 1727204643.24729: done dumping result, returning 41175 1727204643.24743: done running TaskExecutor() for managed-node3/TASK: Delete veth interface ethtest0 [12b410aa-8751-f070-39c4-000000000270] 41175 1727204643.24755: sending task result for task 12b410aa-8751-f070-39c4-000000000270 skipping: [managed-node3] => { "changed": false, "false_condition": "type == 'veth' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 41175 1727204643.25051: no more pending results, returning what we have 41175 
1727204643.25057: results queue empty 41175 1727204643.25058: checking for any_errors_fatal 41175 1727204643.25070: done checking for any_errors_fatal 41175 1727204643.25071: checking for max_fail_percentage 41175 1727204643.25073: done checking for max_fail_percentage 41175 1727204643.25074: checking to see if all hosts have failed and the running result is not ok 41175 1727204643.25075: done checking to see if all hosts have failed 41175 1727204643.25076: getting the remaining hosts for this loop 41175 1727204643.25079: done getting the remaining hosts for this loop 41175 1727204643.25084: getting the next task for host managed-node3 41175 1727204643.25094: done getting next task for host managed-node3 41175 1727204643.25098: ^ task is: TASK: Create dummy interface {{ interface }} 41175 1727204643.25103: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204643.25108: getting variables 41175 1727204643.25110: in VariableManager get_vars() 41175 1727204643.25162: Calling all_inventory to load vars for managed-node3 41175 1727204643.25166: Calling groups_inventory to load vars for managed-node3 41175 1727204643.25169: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204643.25187: Calling all_plugins_play to load vars for managed-node3 41175 1727204643.25396: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204643.25409: done sending task result for task 12b410aa-8751-f070-39c4-000000000270 41175 1727204643.25412: WORKER PROCESS EXITING 41175 1727204643.25418: Calling groups_plugins_play to load vars for managed-node3 41175 1727204643.25720: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204643.26070: done with get_vars() 41175 1727204643.26088: done getting variables 41175 1727204643.26175: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 41175 1727204643.26323: variable 'interface' from source: set_fact TASK [Create dummy interface ethtest0] ***************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Tuesday 24 September 2024 15:04:03 -0400 (0:00:00.035) 0:00:10.402 ***** 41175 1727204643.26360: entering _queue_task() for managed-node3/command 41175 1727204643.26828: worker is 1 (out of 1 available) 41175 1727204643.26844: exiting _queue_task() for managed-node3/command 41175 1727204643.26858: done queuing things up, now waiting for results queue to drain 41175 1727204643.26860: waiting for pending results... 
41175 1727204643.27111: running TaskExecutor() for managed-node3/TASK: Create dummy interface ethtest0 41175 1727204643.27263: in run() - task 12b410aa-8751-f070-39c4-000000000271 41175 1727204643.27268: variable 'ansible_search_path' from source: unknown 41175 1727204643.27270: variable 'ansible_search_path' from source: unknown 41175 1727204643.27310: calling self._execute() 41175 1727204643.27482: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204643.27485: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204643.27491: variable 'omit' from source: magic vars 41175 1727204643.27923: variable 'ansible_distribution_major_version' from source: facts 41175 1727204643.27944: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204643.28248: variable 'type' from source: set_fact 41175 1727204643.28260: variable 'state' from source: include params 41175 1727204643.28271: variable 'interface' from source: set_fact 41175 1727204643.28281: variable 'current_interfaces' from source: set_fact 41175 1727204643.28299: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 41175 1727204643.28357: when evaluation is False, skipping this task 41175 1727204643.28360: _execute() done 41175 1727204643.28363: dumping result to json 41175 1727204643.28366: done dumping result, returning 41175 1727204643.28368: done running TaskExecutor() for managed-node3/TASK: Create dummy interface ethtest0 [12b410aa-8751-f070-39c4-000000000271] 41175 1727204643.28370: sending task result for task 12b410aa-8751-f070-39c4-000000000271 skipping: [managed-node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 41175 1727204643.28643: no more pending results, returning what we have 41175 1727204643.28650: results queue empty 41175 
1727204643.28652: checking for any_errors_fatal 41175 1727204643.28658: done checking for any_errors_fatal 41175 1727204643.28660: checking for max_fail_percentage 41175 1727204643.28662: done checking for max_fail_percentage 41175 1727204643.28663: checking to see if all hosts have failed and the running result is not ok 41175 1727204643.28665: done checking to see if all hosts have failed 41175 1727204643.28666: getting the remaining hosts for this loop 41175 1727204643.28669: done getting the remaining hosts for this loop 41175 1727204643.28674: getting the next task for host managed-node3 41175 1727204643.28681: done getting next task for host managed-node3 41175 1727204643.28684: ^ task is: TASK: Delete dummy interface {{ interface }} 41175 1727204643.28688: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204643.28700: getting variables 41175 1727204643.28703: in VariableManager get_vars() 41175 1727204643.28753: Calling all_inventory to load vars for managed-node3 41175 1727204643.28757: Calling groups_inventory to load vars for managed-node3 41175 1727204643.28760: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204643.28904: Calling all_plugins_play to load vars for managed-node3 41175 1727204643.28908: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204643.28920: done sending task result for task 12b410aa-8751-f070-39c4-000000000271 41175 1727204643.28923: WORKER PROCESS EXITING 41175 1727204643.28927: Calling groups_plugins_play to load vars for managed-node3 41175 1727204643.29284: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204643.29496: done with get_vars() 41175 1727204643.29509: done getting variables 41175 1727204643.29561: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 41175 1727204643.29660: variable 'interface' from source: set_fact TASK [Delete dummy interface ethtest0] ***************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Tuesday 24 September 2024 15:04:03 -0400 (0:00:00.033) 0:00:10.435 ***** 41175 1727204643.29688: entering _queue_task() for managed-node3/command 41175 1727204643.29929: worker is 1 (out of 1 available) 41175 1727204643.29945: exiting _queue_task() for managed-node3/command 41175 1727204643.29959: done queuing things up, now waiting for results queue to drain 41175 1727204643.29961: waiting for pending results... 
41175 1727204643.30149: running TaskExecutor() for managed-node3/TASK: Delete dummy interface ethtest0 41175 1727204643.30230: in run() - task 12b410aa-8751-f070-39c4-000000000272 41175 1727204643.30240: variable 'ansible_search_path' from source: unknown 41175 1727204643.30244: variable 'ansible_search_path' from source: unknown 41175 1727204643.30277: calling self._execute() 41175 1727204643.30359: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204643.30367: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204643.30377: variable 'omit' from source: magic vars 41175 1727204643.30699: variable 'ansible_distribution_major_version' from source: facts 41175 1727204643.30711: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204643.30892: variable 'type' from source: set_fact 41175 1727204643.30896: variable 'state' from source: include params 41175 1727204643.30899: variable 'interface' from source: set_fact 41175 1727204643.30905: variable 'current_interfaces' from source: set_fact 41175 1727204643.30914: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 41175 1727204643.30917: when evaluation is False, skipping this task 41175 1727204643.30923: _execute() done 41175 1727204643.30927: dumping result to json 41175 1727204643.30932: done dumping result, returning 41175 1727204643.30939: done running TaskExecutor() for managed-node3/TASK: Delete dummy interface ethtest0 [12b410aa-8751-f070-39c4-000000000272] 41175 1727204643.30951: sending task result for task 12b410aa-8751-f070-39c4-000000000272 41175 1727204643.31039: done sending task result for task 12b410aa-8751-f070-39c4-000000000272 41175 1727204643.31042: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was 
False" } 41175 1727204643.31111: no more pending results, returning what we have 41175 1727204643.31116: results queue empty 41175 1727204643.31118: checking for any_errors_fatal 41175 1727204643.31126: done checking for any_errors_fatal 41175 1727204643.31127: checking for max_fail_percentage 41175 1727204643.31129: done checking for max_fail_percentage 41175 1727204643.31130: checking to see if all hosts have failed and the running result is not ok 41175 1727204643.31131: done checking to see if all hosts have failed 41175 1727204643.31132: getting the remaining hosts for this loop 41175 1727204643.31134: done getting the remaining hosts for this loop 41175 1727204643.31139: getting the next task for host managed-node3 41175 1727204643.31145: done getting next task for host managed-node3 41175 1727204643.31150: ^ task is: TASK: Create tap interface {{ interface }} 41175 1727204643.31153: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204643.31157: getting variables 41175 1727204643.31159: in VariableManager get_vars() 41175 1727204643.31202: Calling all_inventory to load vars for managed-node3 41175 1727204643.31205: Calling groups_inventory to load vars for managed-node3 41175 1727204643.31208: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204643.31220: Calling all_plugins_play to load vars for managed-node3 41175 1727204643.31223: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204643.31227: Calling groups_plugins_play to load vars for managed-node3 41175 1727204643.31554: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204643.31897: done with get_vars() 41175 1727204643.31911: done getting variables 41175 1727204643.31992: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 41175 1727204643.32135: variable 'interface' from source: set_fact TASK [Create tap interface ethtest0] ******************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Tuesday 24 September 2024 15:04:03 -0400 (0:00:00.024) 0:00:10.460 ***** 41175 1727204643.32179: entering _queue_task() for managed-node3/command 41175 1727204643.32552: worker is 1 (out of 1 available) 41175 1727204643.32569: exiting _queue_task() for managed-node3/command 41175 1727204643.32582: done queuing things up, now waiting for results queue to drain 41175 1727204643.32584: waiting for pending results... 
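Each `TASK [...]` banner in this trace ends with two durations, e.g. `(0:00:00.024)    0:00:10.460` — the time spent on the previous task and the cumulative playbook time, as printed by a profiling/timer callback. A small parser for that banner format can be handy when post-processing logs like this one; the parser itself is an illustrative sketch, not part of Ansible:

```python
import re

# The per-task banner in this log ends with two durations:
#   "... (0:00:00.024)    0:00:10.460 *****"
# i.e. (time spent on the previous task) and cumulative playbook time.
BANNER_RE = re.compile(
    r"\((?P<task>\d+:\d{2}:\d{2}\.\d+)\)\s+(?P<total>\d+:\d{2}:\d{2}\.\d+)"
)

def to_seconds(hms: str) -> float:
    """Convert 'H:MM:SS.fff' into seconds."""
    h, m, s = hms.split(":")
    return int(h) * 3600 + int(m) * 60 + float(s)

def parse_banner(line: str):
    """Return (task_seconds, total_seconds) for a timer banner line."""
    m = BANNER_RE.search(line)
    if m is None:
        raise ValueError("no timing information found")
    return to_seconds(m.group("task")), to_seconds(m.group("total"))

banner = "Tuesday 24 September 2024  15:04:03 -0400 (0:00:00.024)       0:00:10.460 *****"
task_s, total_s = parse_banner(banner)
```

Run over every banner line, this yields a per-task timing profile of the play without needing any callback plugin configuration.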
41175 1727204643.32776: running TaskExecutor() for managed-node3/TASK: Create tap interface ethtest0 41175 1727204643.32862: in run() - task 12b410aa-8751-f070-39c4-000000000273 41175 1727204643.32874: variable 'ansible_search_path' from source: unknown 41175 1727204643.32878: variable 'ansible_search_path' from source: unknown 41175 1727204643.32914: calling self._execute() 41175 1727204643.32995: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204643.33002: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204643.33012: variable 'omit' from source: magic vars 41175 1727204643.33342: variable 'ansible_distribution_major_version' from source: facts 41175 1727204643.33355: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204643.33538: variable 'type' from source: set_fact 41175 1727204643.33542: variable 'state' from source: include params 41175 1727204643.33547: variable 'interface' from source: set_fact 41175 1727204643.33553: variable 'current_interfaces' from source: set_fact 41175 1727204643.33561: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 41175 1727204643.33564: when evaluation is False, skipping this task 41175 1727204643.33567: _execute() done 41175 1727204643.33574: dumping result to json 41175 1727204643.33577: done dumping result, returning 41175 1727204643.33586: done running TaskExecutor() for managed-node3/TASK: Create tap interface ethtest0 [12b410aa-8751-f070-39c4-000000000273] 41175 1727204643.33595: sending task result for task 12b410aa-8751-f070-39c4-000000000273 41175 1727204643.33690: done sending task result for task 12b410aa-8751-f070-39c4-000000000273 41175 1727204643.33693: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was 
False" } 41175 1727204643.33748: no more pending results, returning what we have 41175 1727204643.33753: results queue empty 41175 1727204643.33754: checking for any_errors_fatal 41175 1727204643.33760: done checking for any_errors_fatal 41175 1727204643.33761: checking for max_fail_percentage 41175 1727204643.33763: done checking for max_fail_percentage 41175 1727204643.33764: checking to see if all hosts have failed and the running result is not ok 41175 1727204643.33765: done checking to see if all hosts have failed 41175 1727204643.33767: getting the remaining hosts for this loop 41175 1727204643.33769: done getting the remaining hosts for this loop 41175 1727204643.33774: getting the next task for host managed-node3 41175 1727204643.33781: done getting next task for host managed-node3 41175 1727204643.33784: ^ task is: TASK: Delete tap interface {{ interface }} 41175 1727204643.33787: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204643.33799: getting variables 41175 1727204643.33801: in VariableManager get_vars() 41175 1727204643.33843: Calling all_inventory to load vars for managed-node3 41175 1727204643.33846: Calling groups_inventory to load vars for managed-node3 41175 1727204643.33848: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204643.33860: Calling all_plugins_play to load vars for managed-node3 41175 1727204643.33863: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204643.33867: Calling groups_plugins_play to load vars for managed-node3 41175 1727204643.34054: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204643.34246: done with get_vars() 41175 1727204643.34256: done getting variables 41175 1727204643.34307: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 41175 1727204643.34404: variable 'interface' from source: set_fact TASK [Delete tap interface ethtest0] ******************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Tuesday 24 September 2024 15:04:03 -0400 (0:00:00.022) 0:00:10.483 ***** 41175 1727204643.34429: entering _queue_task() for managed-node3/command 41175 1727204643.34679: worker is 1 (out of 1 available) 41175 1727204643.34699: exiting _queue_task() for managed-node3/command 41175 1727204643.34715: done queuing things up, now waiting for results queue to drain 41175 1727204643.34717: waiting for pending results... 
41175 1727204643.34998: running TaskExecutor() for managed-node3/TASK: Delete tap interface ethtest0 41175 1727204643.35135: in run() - task 12b410aa-8751-f070-39c4-000000000274 41175 1727204643.35156: variable 'ansible_search_path' from source: unknown 41175 1727204643.35164: variable 'ansible_search_path' from source: unknown 41175 1727204643.35211: calling self._execute() 41175 1727204643.35330: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204643.35345: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204643.35362: variable 'omit' from source: magic vars 41175 1727204643.35962: variable 'ansible_distribution_major_version' from source: facts 41175 1727204643.35966: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204643.36221: variable 'type' from source: set_fact 41175 1727204643.36234: variable 'state' from source: include params 41175 1727204643.36244: variable 'interface' from source: set_fact 41175 1727204643.36254: variable 'current_interfaces' from source: set_fact 41175 1727204643.36269: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 41175 1727204643.36278: when evaluation is False, skipping this task 41175 1727204643.36395: _execute() done 41175 1727204643.36398: dumping result to json 41175 1727204643.36401: done dumping result, returning 41175 1727204643.36403: done running TaskExecutor() for managed-node3/TASK: Delete tap interface ethtest0 [12b410aa-8751-f070-39c4-000000000274] 41175 1727204643.36405: sending task result for task 12b410aa-8751-f070-39c4-000000000274 41175 1727204643.36482: done sending task result for task 12b410aa-8751-f070-39c4-000000000274 41175 1727204643.36485: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 
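The three consecutive "Evaluated conditional ... False" / "skipping" entries all follow the same pattern: the tasks in manage_test_interface.yml are each gated by a `when` expression over `type`, `state`, and `current_interfaces`, so only the task matching the requested operation actually runs. Ansible evaluates these as Jinja2 conditionals; the sketch below reproduces the gating in plain Python, with hypothetical fact values chosen to match this run (a dummy `ethtest0` that already exists):

```python
# Hypothetical reproduction of the 'when' gating seen in the log.
# Ansible evaluates these as Jinja2 expressions; plain eval() is a stand-in.
def should_run(cond, task_vars):
    return bool(eval(cond, {}, task_vars))

# Conditions copied verbatim from the skipped tasks in the log.
conditions = {
    "Delete dummy interface": "type == 'dummy' and state == 'absent' and interface in current_interfaces",
    "Create tap interface":   "type == 'tap' and state == 'present' and interface not in current_interfaces",
    "Delete tap interface":   "type == 'tap' and state == 'absent' and interface in current_interfaces",
}

# Hypothetical variable values consistent with this run.
task_vars = {
    "type": "dummy",
    "state": "present",
    "interface": "ethtest0",
    "current_interfaces": ["lo", "eth0", "ethtest0"],
}

skipped = [name for name, cond in conditions.items()
           if not should_run(cond, task_vars)]
```

With these values all three conditionals evaluate False, matching the `"skip_reason": "Conditional result was False"` results above.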
41175 1727204643.36544: no more pending results, returning what we have 41175 1727204643.36549: results queue empty 41175 1727204643.36551: checking for any_errors_fatal 41175 1727204643.36558: done checking for any_errors_fatal 41175 1727204643.36560: checking for max_fail_percentage 41175 1727204643.36562: done checking for max_fail_percentage 41175 1727204643.36563: checking to see if all hosts have failed and the running result is not ok 41175 1727204643.36564: done checking to see if all hosts have failed 41175 1727204643.36566: getting the remaining hosts for this loop 41175 1727204643.36570: done getting the remaining hosts for this loop 41175 1727204643.36575: getting the next task for host managed-node3 41175 1727204643.36586: done getting next task for host managed-node3 41175 1727204643.36592: ^ task is: TASK: Include the task 'assert_device_present.yml' 41175 1727204643.36596: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204643.36601: getting variables 41175 1727204643.36602: in VariableManager get_vars() 41175 1727204643.36649: Calling all_inventory to load vars for managed-node3 41175 1727204643.36652: Calling groups_inventory to load vars for managed-node3 41175 1727204643.36655: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204643.36671: Calling all_plugins_play to load vars for managed-node3 41175 1727204643.36674: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204643.36677: Calling groups_plugins_play to load vars for managed-node3 41175 1727204643.36932: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204643.37115: done with get_vars() 41175 1727204643.37125: done getting variables TASK [Include the task 'assert_device_present.yml'] **************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:21 Tuesday 24 September 2024 15:04:03 -0400 (0:00:00.027) 0:00:10.510 ***** 41175 1727204643.37205: entering _queue_task() for managed-node3/include_tasks 41175 1727204643.37440: worker is 1 (out of 1 available) 41175 1727204643.37455: exiting _queue_task() for managed-node3/include_tasks 41175 1727204643.37468: done queuing things up, now waiting for results queue to drain 41175 1727204643.37471: waiting for pending results... 
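Every debug entry in this trace has the shape `<pid> <epoch>: <message>` (here pid 41175 and a Unix timestamp with sub-second precision). A minimal parser for that shape — the format is simply how Ansible's debug logger prints at `-vvvv`; the parser is this sketch's own:

```python
import re
from datetime import datetime, timezone

# One '-vvvv' debug entry: "<pid> <epoch.micros>: <message>"
ENTRY_RE = re.compile(r"^(?P<pid>\d+) (?P<ts>\d+\.\d+): (?P<msg>.*)$")

def parse_entry(line: str):
    """Split one debug entry into (pid, epoch_seconds, message)."""
    m = ENTRY_RE.match(line)
    if m is None:
        raise ValueError("not a debug entry")
    return int(m.group("pid")), float(m.group("ts")), m.group("msg")

pid, ts, msg = parse_entry(
    "41175 1727204643.37205: entering _queue_task() for managed-node3/include_tasks"
)
when = datetime.fromtimestamp(ts, tz=timezone.utc)  # 2024-09-24 19:04:03 UTC
```

Converting the epoch confirms the wall-clock banners in the log: 1727204643 is 15:04:03 -0400, i.e. 19:04:03 UTC.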
41175 1727204643.37654: running TaskExecutor() for managed-node3/TASK: Include the task 'assert_device_present.yml' 41175 1727204643.37734: in run() - task 12b410aa-8751-f070-39c4-00000000000e 41175 1727204643.37744: variable 'ansible_search_path' from source: unknown 41175 1727204643.37777: calling self._execute() 41175 1727204643.37860: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204643.37865: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204643.37876: variable 'omit' from source: magic vars 41175 1727204643.38197: variable 'ansible_distribution_major_version' from source: facts 41175 1727204643.38209: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204643.38215: _execute() done 41175 1727204643.38222: dumping result to json 41175 1727204643.38227: done dumping result, returning 41175 1727204643.38307: done running TaskExecutor() for managed-node3/TASK: Include the task 'assert_device_present.yml' [12b410aa-8751-f070-39c4-00000000000e] 41175 1727204643.38311: sending task result for task 12b410aa-8751-f070-39c4-00000000000e 41175 1727204643.38387: done sending task result for task 12b410aa-8751-f070-39c4-00000000000e 41175 1727204643.38511: WORKER PROCESS EXITING 41175 1727204643.38611: no more pending results, returning what we have 41175 1727204643.38616: in VariableManager get_vars() 41175 1727204643.38670: Calling all_inventory to load vars for managed-node3 41175 1727204643.38673: Calling groups_inventory to load vars for managed-node3 41175 1727204643.38676: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204643.38688: Calling all_plugins_play to load vars for managed-node3 41175 1727204643.38696: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204643.38700: Calling groups_plugins_play to load vars for managed-node3 41175 1727204643.38983: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204643.39355: done with get_vars() 41175 1727204643.39364: variable 'ansible_search_path' from source: unknown 41175 1727204643.39378: we have included files to process 41175 1727204643.39379: generating all_blocks data 41175 1727204643.39380: done generating all_blocks data 41175 1727204643.39390: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 41175 1727204643.39394: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 41175 1727204643.39403: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 41175 1727204643.39613: in VariableManager get_vars() 41175 1727204643.39652: done with get_vars() 41175 1727204643.39801: done processing included file 41175 1727204643.39804: iterating over new_blocks loaded from include file 41175 1727204643.39806: in VariableManager get_vars() 41175 1727204643.39830: done with get_vars() 41175 1727204643.39832: filtering new block on tags 41175 1727204643.39858: done filtering new block on tags 41175 1727204643.39861: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed-node3 41175 1727204643.39868: extending task lists for all hosts with included blocks 41175 1727204643.42781: done extending task lists 41175 1727204643.42783: done processing included files 41175 1727204643.42784: results queue empty 41175 1727204643.42784: checking for any_errors_fatal 41175 1727204643.42787: done checking for any_errors_fatal 41175 1727204643.42788: checking for max_fail_percentage 41175 1727204643.42791: done 
checking for max_fail_percentage 41175 1727204643.42792: checking to see if all hosts have failed and the running result is not ok 41175 1727204643.42792: done checking to see if all hosts have failed 41175 1727204643.42793: getting the remaining hosts for this loop 41175 1727204643.42794: done getting the remaining hosts for this loop 41175 1727204643.42796: getting the next task for host managed-node3 41175 1727204643.42799: done getting next task for host managed-node3 41175 1727204643.42801: ^ task is: TASK: Include the task 'get_interface_stat.yml' 41175 1727204643.42804: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204643.42806: getting variables 41175 1727204643.42806: in VariableManager get_vars() 41175 1727204643.42826: Calling all_inventory to load vars for managed-node3 41175 1727204643.42828: Calling groups_inventory to load vars for managed-node3 41175 1727204643.42830: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204643.42836: Calling all_plugins_play to load vars for managed-node3 41175 1727204643.42838: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204643.42840: Calling groups_plugins_play to load vars for managed-node3 41175 1727204643.42994: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204643.43174: done with get_vars() 41175 1727204643.43184: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 15:04:03 -0400 (0:00:00.060) 0:00:10.571 ***** 41175 1727204643.43252: entering _queue_task() for managed-node3/include_tasks 41175 1727204643.43517: worker is 1 (out of 1 available) 41175 1727204643.43531: exiting _queue_task() for managed-node3/include_tasks 41175 1727204643.43544: done queuing things up, now waiting for results queue to drain 41175 1727204643.43546: waiting for pending results... 
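Each `get_vars()` pass above invokes the same six loaders in a fixed order — all_inventory, groups_inventory, all_plugins_inventory, all_plugins_play, groups_plugins_inventory, groups_plugins_play — and, consistent with Ansible's documented variable precedence, later sources override earlier ones, which is how host vars such as `ansible_host` win over broader defaults. A toy model of that layered merge (the source names match the log; the variable contents are invented for illustration):

```python
# Toy model of the layered variable merge visible in each get_vars() pass.
# Loader names match the log; variable contents are hypothetical.
def merge_sources(sources):
    """Apply sources in order; later sources override earlier ones."""
    merged = {}
    for _name, vars_ in sources:
        merged.update(vars_)
    return merged

sources = [
    ("all_inventory",            {"ansible_timeout": 10, "interface": "eth0"}),
    ("groups_inventory",         {"interface": "ethtest0"}),
    ("all_plugins_inventory",    {}),
    ("all_plugins_play",         {"state": "present"}),
    ("groups_plugins_inventory", {}),
    ("groups_plugins_play",      {"type": "dummy"}),
]

merged = merge_sources(sources)
```

In this sketch `interface` ends up as `ethtest0` because the group-level source is applied after the all-inventory default — the same override direction the log's "variable ... from source: ..." lines record.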
41175 1727204643.43733: running TaskExecutor() for managed-node3/TASK: Include the task 'get_interface_stat.yml' 41175 1727204643.43818: in run() - task 12b410aa-8751-f070-39c4-0000000003e0 41175 1727204643.43834: variable 'ansible_search_path' from source: unknown 41175 1727204643.43838: variable 'ansible_search_path' from source: unknown 41175 1727204643.44095: calling self._execute() 41175 1727204643.44098: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204643.44101: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204643.44104: variable 'omit' from source: magic vars 41175 1727204643.44474: variable 'ansible_distribution_major_version' from source: facts 41175 1727204643.44498: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204643.44511: _execute() done 41175 1727204643.44524: dumping result to json 41175 1727204643.44533: done dumping result, returning 41175 1727204643.44543: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_interface_stat.yml' [12b410aa-8751-f070-39c4-0000000003e0] 41175 1727204643.44555: sending task result for task 12b410aa-8751-f070-39c4-0000000003e0 41175 1727204643.44700: no more pending results, returning what we have 41175 1727204643.44707: in VariableManager get_vars() 41175 1727204643.44769: Calling all_inventory to load vars for managed-node3 41175 1727204643.44774: Calling groups_inventory to load vars for managed-node3 41175 1727204643.44777: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204643.44798: Calling all_plugins_play to load vars for managed-node3 41175 1727204643.44801: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204643.44806: Calling groups_plugins_play to load vars for managed-node3 41175 1727204643.45258: done sending task result for task 12b410aa-8751-f070-39c4-0000000003e0 41175 1727204643.45262: WORKER PROCESS EXITING 41175 
1727204643.45280: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204643.45623: done with get_vars() 41175 1727204643.45640: variable 'ansible_search_path' from source: unknown 41175 1727204643.45641: variable 'ansible_search_path' from source: unknown 41175 1727204643.45686: we have included files to process 41175 1727204643.45688: generating all_blocks data 41175 1727204643.45693: done generating all_blocks data 41175 1727204643.45695: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 41175 1727204643.45696: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 41175 1727204643.45699: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 41175 1727204643.46027: done processing included file 41175 1727204643.46030: iterating over new_blocks loaded from include file 41175 1727204643.46032: in VariableManager get_vars() 41175 1727204643.46059: done with get_vars() 41175 1727204643.46066: filtering new block on tags 41175 1727204643.46088: done filtering new block on tags 41175 1727204643.46094: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node3 41175 1727204643.46101: extending task lists for all hosts with included blocks 41175 1727204643.46256: done extending task lists 41175 1727204643.46257: done processing included files 41175 1727204643.46258: results queue empty 41175 1727204643.46259: checking for any_errors_fatal 41175 1727204643.46264: done checking for any_errors_fatal 41175 1727204643.46265: checking for max_fail_percentage 41175 1727204643.46267: done checking for 
max_fail_percentage 41175 1727204643.46268: checking to see if all hosts have failed and the running result is not ok 41175 1727204643.46269: done checking to see if all hosts have failed 41175 1727204643.46270: getting the remaining hosts for this loop 41175 1727204643.46272: done getting the remaining hosts for this loop 41175 1727204643.46275: getting the next task for host managed-node3 41175 1727204643.46286: done getting next task for host managed-node3 41175 1727204643.46289: ^ task is: TASK: Get stat for interface {{ interface }} 41175 1727204643.46295: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204643.46299: getting variables 41175 1727204643.46300: in VariableManager get_vars() 41175 1727204643.46321: Calling all_inventory to load vars for managed-node3 41175 1727204643.46324: Calling groups_inventory to load vars for managed-node3 41175 1727204643.46327: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204643.46334: Calling all_plugins_play to load vars for managed-node3 41175 1727204643.46337: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204643.46341: Calling groups_plugins_play to load vars for managed-node3 41175 1727204643.46613: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204643.46939: done with get_vars() 41175 1727204643.46953: done getting variables 41175 1727204643.47164: variable 'interface' from source: set_fact TASK [Get stat for interface ethtest0] ***************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 15:04:03 -0400 (0:00:00.039) 0:00:10.610 ***** 41175 1727204643.47203: entering _queue_task() for managed-node3/stat 41175 1727204643.47708: worker is 1 (out of 1 available) 41175 1727204643.47724: exiting _queue_task() for managed-node3/stat 41175 1727204643.47736: done queuing things up, now waiting for results queue to drain 41175 1727204643.47739: waiting for pending results... 
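The `stat` action that starts here will shortly transfer `AnsiballZ_stat.py` into a per-invocation remote temp directory; further down the log that directory appears as `ansible-tmp-1727204643.5287738-41812-159224318824822`, i.e. `ansible-tmp-<epoch>-<local pid>-<random>`. The sketch below reconstructs that naming scheme for illustration — it mimics the observed format rather than quoting Ansible's actual implementation:

```python
import os
import random
import re
import time

def make_remote_tmp_name() -> str:
    """Mimic the 'ansible-tmp-<epoch>-<pid>-<random>' names seen in this log
    (e.g. ansible-tmp-1727204643.5287738-41812-159224318824822).
    Illustrative reconstruction, not Ansible's own code."""
    return "ansible-tmp-%s-%s-%s" % (time.time(), os.getpid(),
                                     random.randint(0, 2 ** 48))

TMP_RE = re.compile(r"^ansible-tmp-\d+\.\d+-\d+-\d+$")

name = make_remote_tmp_name()
```

The epoch component makes the name sortable by creation time, the pid ties it to the controller process (41175 throughout this log), and the random suffix avoids collisions when several workers create temp dirs in the same second.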
41175 1727204643.48027: running TaskExecutor() for managed-node3/TASK: Get stat for interface ethtest0 41175 1727204643.48153: in run() - task 12b410aa-8751-f070-39c4-0000000004ff 41175 1727204643.48232: variable 'ansible_search_path' from source: unknown 41175 1727204643.48236: variable 'ansible_search_path' from source: unknown 41175 1727204643.48253: calling self._execute() 41175 1727204643.48378: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204643.48395: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204643.48414: variable 'omit' from source: magic vars 41175 1727204643.48926: variable 'ansible_distribution_major_version' from source: facts 41175 1727204643.48998: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204643.49002: variable 'omit' from source: magic vars 41175 1727204643.49040: variable 'omit' from source: magic vars 41175 1727204643.49191: variable 'interface' from source: set_fact 41175 1727204643.49235: variable 'omit' from source: magic vars 41175 1727204643.49291: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41175 1727204643.49355: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41175 1727204643.49426: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41175 1727204643.49435: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204643.49451: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204643.49504: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41175 1727204643.49520: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204643.49547: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204643.49753: Set connection var ansible_shell_executable to /bin/sh 41175 1727204643.49760: Set connection var ansible_shell_type to sh 41175 1727204643.49763: Set connection var ansible_pipelining to False 41175 1727204643.49766: Set connection var ansible_timeout to 10 41175 1727204643.49770: Set connection var ansible_connection to ssh 41175 1727204643.49772: Set connection var ansible_module_compression to ZIP_DEFLATED 41175 1727204643.49862: variable 'ansible_shell_executable' from source: unknown 41175 1727204643.49867: variable 'ansible_connection' from source: unknown 41175 1727204643.49870: variable 'ansible_module_compression' from source: unknown 41175 1727204643.49872: variable 'ansible_shell_type' from source: unknown 41175 1727204643.49875: variable 'ansible_shell_executable' from source: unknown 41175 1727204643.49877: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204643.49887: variable 'ansible_pipelining' from source: unknown 41175 1727204643.49890: variable 'ansible_timeout' from source: unknown 41175 1727204643.49894: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204643.50161: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 41175 1727204643.50189: variable 'omit' from source: magic vars 41175 1727204643.50215: starting attempt loop 41175 1727204643.50222: running the handler 41175 1727204643.50230: _low_level_execute_command(): starting 41175 1727204643.50238: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41175 1727204643.50766: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204643.50781: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204643.50803: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204643.50866: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204643.50870: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204643.50877: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204643.50920: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204643.52669: stdout chunk (state=3): >>>/root <<< 41175 1727204643.52796: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204643.52850: stderr chunk (state=3): >>><<< 41175 1727204643.52853: stdout chunk (state=3): >>><<< 41175 1727204643.52869: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204643.52893: _low_level_execute_command(): starting 41175 1727204643.52905: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204643.5287738-41812-159224318824822 `" && echo ansible-tmp-1727204643.5287738-41812-159224318824822="` echo /root/.ansible/tmp/ansible-tmp-1727204643.5287738-41812-159224318824822 `" ) && sleep 0' 41175 1727204643.53380: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204643.53420: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 41175 1727204643.53425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204643.53427: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204643.53439: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 41175 1727204643.53442: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204643.53488: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204643.53495: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204643.53536: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204643.55549: stdout chunk (state=3): >>>ansible-tmp-1727204643.5287738-41812-159224318824822=/root/.ansible/tmp/ansible-tmp-1727204643.5287738-41812-159224318824822 <<< 41175 1727204643.55660: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204643.55727: stderr chunk (state=3): >>><<< 41175 1727204643.55731: stdout chunk (state=3): >>><<< 41175 1727204643.55751: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204643.5287738-41812-159224318824822=/root/.ansible/tmp/ansible-tmp-1727204643.5287738-41812-159224318824822 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204643.55798: variable 'ansible_module_compression' from source: unknown 41175 1727204643.55852: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-41175qa2aqh90/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 41175 1727204643.55885: variable 'ansible_facts' from source: unknown 41175 1727204643.55955: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204643.5287738-41812-159224318824822/AnsiballZ_stat.py 41175 1727204643.56078: Sending initial data 41175 1727204643.56081: Sent initial data (153 bytes) 41175 1727204643.56555: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204643.56603: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 41175 1727204643.56606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204643.56609: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204643.56611: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 41175 1727204643.56613: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204643.56664: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204643.56671: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204643.56674: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204643.56711: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204643.58328: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41175 1727204643.58357: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41175 1727204643.58393: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41175qa2aqh90/tmph4m1yz12 /root/.ansible/tmp/ansible-tmp-1727204643.5287738-41812-159224318824822/AnsiballZ_stat.py <<< 41175 1727204643.58406: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204643.5287738-41812-159224318824822/AnsiballZ_stat.py" <<< 41175 1727204643.58428: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-41175qa2aqh90/tmph4m1yz12" to remote "/root/.ansible/tmp/ansible-tmp-1727204643.5287738-41812-159224318824822/AnsiballZ_stat.py" <<< 41175 1727204643.58435: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204643.5287738-41812-159224318824822/AnsiballZ_stat.py" <<< 41175 1727204643.59207: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204643.59285: stderr chunk (state=3): >>><<< 41175 1727204643.59288: stdout chunk (state=3): >>><<< 41175 1727204643.59315: done transferring module to remote 41175 1727204643.59327: _low_level_execute_command(): starting 41175 1727204643.59331: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204643.5287738-41812-159224318824822/ /root/.ansible/tmp/ansible-tmp-1727204643.5287738-41812-159224318824822/AnsiballZ_stat.py && sleep 0' 41175 1727204643.59794: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204643.59835: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204643.59838: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 41175 1727204643.59841: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204643.59843: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204643.59845: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 41175 1727204643.59851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204643.59908: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204643.59916: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204643.59918: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204643.59952: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204643.62001: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204643.62062: stderr chunk (state=3): >>><<< 41175 1727204643.62065: stdout chunk (state=3): >>><<< 41175 1727204643.62080: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204643.62083: _low_level_execute_command(): starting 41175 1727204643.62091: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204643.5287738-41812-159224318824822/AnsiballZ_stat.py && sleep 0' 41175 1727204643.62600: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204643.62604: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204643.62607: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204643.62610: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204643.62673: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204643.62676: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204643.62679: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204643.62725: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204643.80521: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/ethtest0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 39044, "dev": 23, "nlink": 1, "atime": 1727204641.9979906, "mtime": 1727204641.9979906, "ctime": 1727204641.9979906, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/ethtest0", "lnk_target": "../../devices/virtual/net/ethtest0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 41175 1727204643.82005: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
<<< 41175 1727204643.82064: stderr chunk (state=3): >>><<< 41175 1727204643.82068: stdout chunk (state=3): >>><<< 41175 1727204643.82084: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/ethtest0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 39044, "dev": 23, "nlink": 1, "atime": 1727204641.9979906, "mtime": 1727204641.9979906, "ctime": 1727204641.9979906, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/ethtest0", "lnk_target": "../../devices/virtual/net/ethtest0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 41175 1727204643.82144: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204643.5287738-41812-159224318824822/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41175 1727204643.82154: _low_level_execute_command(): starting 41175 1727204643.82162: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204643.5287738-41812-159224318824822/ > /dev/null 2>&1 && sleep 0' 41175 1727204643.82651: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204643.82655: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204643.82657: stderr chunk (state=3): >>>debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204643.82660: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204643.82703: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204643.82721: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204643.82764: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204643.84730: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204643.84787: stderr chunk (state=3): >>><<< 41175 1727204643.84793: stdout chunk (state=3): >>><<< 41175 1727204643.84810: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204643.84821: handler run complete 41175 1727204643.84866: attempt loop complete, returning result 41175 1727204643.84870: _execute() done 41175 1727204643.84872: dumping result to json 41175 1727204643.84879: done dumping result, returning 41175 1727204643.84888: done running TaskExecutor() for managed-node3/TASK: Get stat for interface ethtest0 [12b410aa-8751-f070-39c4-0000000004ff] 41175 1727204643.84896: sending task result for task 12b410aa-8751-f070-39c4-0000000004ff 41175 1727204643.85022: done sending task result for task 12b410aa-8751-f070-39c4-0000000004ff 41175 1727204643.85025: WORKER PROCESS EXITING
ok: [managed-node3] => {
    "changed": false,
    "stat": {
        "atime": 1727204641.9979906,
        "block_size": 4096,
        "blocks": 0,
        "ctime": 1727204641.9979906,
        "dev": 23,
        "device_type": 0,
        "executable": true,
        "exists": true,
        "gid": 0,
        "gr_name": "root",
        "inode": 39044,
        "isblk": false,
        "ischr": false,
        "isdir": false,
        "isfifo": false,
        "isgid": false,
        "islnk": true,
        "isreg": false,
        "issock": false,
        "isuid": false,
        "lnk_source": "/sys/devices/virtual/net/ethtest0",
        "lnk_target": "../../devices/virtual/net/ethtest0",
        "mode": "0777",
        "mtime": 1727204641.9979906,
        "nlink": 1,
        "path": "/sys/class/net/ethtest0",
        "pw_name": "root",
        "readable": true,
        "rgrp": true,
        "roth": true,
        "rusr": true,
        "size": 0,
        "uid": 0,
        "wgrp": true,
        "woth": true,
        "writeable": true,
        "wusr": true,
        "xgrp": true,
        "xoth": true,
        "xusr": true
    }
}
41175 1727204643.85158: no more pending results, returning what we have 41175 1727204643.85162: results queue empty 41175 1727204643.85163: checking for any_errors_fatal 41175 1727204643.85165: done checking for any_errors_fatal 41175 1727204643.85166:
checking for max_fail_percentage 41175 1727204643.85168: done checking for max_fail_percentage 41175 1727204643.85169: checking to see if all hosts have failed and the running result is not ok 41175 1727204643.85170: done checking to see if all hosts have failed 41175 1727204643.85171: getting the remaining hosts for this loop 41175 1727204643.85172: done getting the remaining hosts for this loop 41175 1727204643.85176: getting the next task for host managed-node3 41175 1727204643.85183: done getting next task for host managed-node3 41175 1727204643.85186: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 41175 1727204643.85191: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204643.85195: getting variables 41175 1727204643.85196: in VariableManager get_vars() 41175 1727204643.85247: Calling all_inventory to load vars for managed-node3 41175 1727204643.85250: Calling groups_inventory to load vars for managed-node3 41175 1727204643.85253: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204643.85264: Calling all_plugins_play to load vars for managed-node3 41175 1727204643.85267: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204643.85270: Calling groups_plugins_play to load vars for managed-node3 41175 1727204643.85448: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204643.85659: done with get_vars() 41175 1727204643.85669: done getting variables 41175 1727204643.85756: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 41175 1727204643.85860: variable 'interface' from source: set_fact

TASK [Assert that the interface is present - 'ethtest0'] ***********************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5
Tuesday 24 September 2024  15:04:03 -0400 (0:00:00.386)       0:00:10.997 *****

41175 1727204643.85887: entering _queue_task() for managed-node3/assert 41175 1727204643.85890: Creating lock for assert 41175 1727204643.86140: worker is 1 (out of 1 available) 41175 1727204643.86156: exiting _queue_task() for managed-node3/assert 41175 1727204643.86167: done queuing things up, now waiting for results queue to drain 41175 1727204643.86169: waiting for pending results... 
41175 1727204643.86355: running TaskExecutor() for managed-node3/TASK: Assert that the interface is present - 'ethtest0' 41175 1727204643.86434: in run() - task 12b410aa-8751-f070-39c4-0000000003e1 41175 1727204643.86447: variable 'ansible_search_path' from source: unknown 41175 1727204643.86452: variable 'ansible_search_path' from source: unknown 41175 1727204643.86485: calling self._execute() 41175 1727204643.86562: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204643.86569: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204643.86579: variable 'omit' from source: magic vars 41175 1727204643.86896: variable 'ansible_distribution_major_version' from source: facts 41175 1727204643.86909: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204643.86920: variable 'omit' from source: magic vars 41175 1727204643.86955: variable 'omit' from source: magic vars 41175 1727204643.87040: variable 'interface' from source: set_fact 41175 1727204643.87059: variable 'omit' from source: magic vars 41175 1727204643.87097: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41175 1727204643.87128: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41175 1727204643.87146: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41175 1727204643.87165: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204643.87178: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204643.87208: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41175 1727204643.87212: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204643.87220: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204643.87307: Set connection var ansible_shell_executable to /bin/sh 41175 1727204643.87311: Set connection var ansible_shell_type to sh 41175 1727204643.87320: Set connection var ansible_pipelining to False 41175 1727204643.87328: Set connection var ansible_timeout to 10 41175 1727204643.87334: Set connection var ansible_connection to ssh 41175 1727204643.87340: Set connection var ansible_module_compression to ZIP_DEFLATED 41175 1727204643.87361: variable 'ansible_shell_executable' from source: unknown 41175 1727204643.87364: variable 'ansible_connection' from source: unknown 41175 1727204643.87367: variable 'ansible_module_compression' from source: unknown 41175 1727204643.87374: variable 'ansible_shell_type' from source: unknown 41175 1727204643.87377: variable 'ansible_shell_executable' from source: unknown 41175 1727204643.87379: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204643.87392: variable 'ansible_pipelining' from source: unknown 41175 1727204643.87394: variable 'ansible_timeout' from source: unknown 41175 1727204643.87399: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204643.87519: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41175 1727204643.87529: variable 'omit' from source: magic vars 41175 1727204643.87536: starting attempt loop 41175 1727204643.87539: running the handler 41175 1727204643.87654: variable 'interface_stat' from source: set_fact 41175 1727204643.87672: Evaluated conditional (interface_stat.stat.exists): True 41175 1727204643.87680: handler run complete 41175 1727204643.87696: attempt loop complete, returning result 41175 
1727204643.87699: _execute() done 41175 1727204643.87701: dumping result to json 41175 1727204643.87706: done dumping result, returning 41175 1727204643.87723: done running TaskExecutor() for managed-node3/TASK: Assert that the interface is present - 'ethtest0' [12b410aa-8751-f070-39c4-0000000003e1] 41175 1727204643.87726: sending task result for task 12b410aa-8751-f070-39c4-0000000003e1 41175 1727204643.87812: done sending task result for task 12b410aa-8751-f070-39c4-0000000003e1 41175 1727204643.87815: WORKER PROCESS EXITING
ok: [managed-node3] => {
    "changed": false
}

MSG:

All assertions passed
41175 1727204643.87883: no more pending results, returning what we have 41175 1727204643.87887: results queue empty 41175 1727204643.87891: checking for any_errors_fatal 41175 1727204643.87900: done checking for any_errors_fatal 41175 1727204643.87901: checking for max_fail_percentage 41175 1727204643.87903: done checking for max_fail_percentage 41175 1727204643.87904: checking to see if all hosts have failed and the running result is not ok 41175 1727204643.87905: done checking to see if all hosts have failed 41175 1727204643.87906: getting the remaining hosts for this loop 41175 1727204643.87908: done getting the remaining hosts for this loop 41175 1727204643.87913: getting the next task for host managed-node3 41175 1727204643.87925: done getting next task for host managed-node3 41175 1727204643.87933: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 41175 1727204643.87936: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41175 1727204643.87952: getting variables 41175 1727204643.87954: in VariableManager get_vars() 41175 1727204643.88002: Calling all_inventory to load vars for managed-node3 41175 1727204643.88005: Calling groups_inventory to load vars for managed-node3 41175 1727204643.88008: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204643.88022: Calling all_plugins_play to load vars for managed-node3 41175 1727204643.88025: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204643.88029: Calling groups_plugins_play to load vars for managed-node3 41175 1727204643.88198: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204643.88393: done with get_vars() 41175 1727204643.88403: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 15:04:03 -0400 (0:00:00.025) 0:00:11.023 ***** 41175 1727204643.88486: entering _queue_task() for managed-node3/include_tasks 41175 1727204643.88728: worker is 1 (out of 1 available) 41175 1727204643.88751: exiting _queue_task() for managed-node3/include_tasks 41175 1727204643.88765: done queuing things up, now waiting for results queue to drain 41175 1727204643.88767: waiting for pending results... 
41175 1727204643.88944: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 41175 1727204643.89042: in run() - task 12b410aa-8751-f070-39c4-000000000016 41175 1727204643.89056: variable 'ansible_search_path' from source: unknown 41175 1727204643.89060: variable 'ansible_search_path' from source: unknown 41175 1727204643.89096: calling self._execute() 41175 1727204643.89165: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204643.89171: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204643.89183: variable 'omit' from source: magic vars 41175 1727204643.89492: variable 'ansible_distribution_major_version' from source: facts 41175 1727204643.89502: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204643.89509: _execute() done 41175 1727204643.89514: dumping result to json 41175 1727204643.89520: done dumping result, returning 41175 1727204643.89527: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12b410aa-8751-f070-39c4-000000000016] 41175 1727204643.89533: sending task result for task 12b410aa-8751-f070-39c4-000000000016 41175 1727204643.89636: done sending task result for task 12b410aa-8751-f070-39c4-000000000016 41175 1727204643.89639: WORKER PROCESS EXITING 41175 1727204643.89690: no more pending results, returning what we have 41175 1727204643.89695: in VariableManager get_vars() 41175 1727204643.89743: Calling all_inventory to load vars for managed-node3 41175 1727204643.89746: Calling groups_inventory to load vars for managed-node3 41175 1727204643.89749: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204643.89759: Calling all_plugins_play to load vars for managed-node3 41175 1727204643.89762: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204643.89765: Calling 
groups_plugins_play to load vars for managed-node3 41175 1727204643.89973: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204643.90154: done with get_vars() 41175 1727204643.90161: variable 'ansible_search_path' from source: unknown 41175 1727204643.90162: variable 'ansible_search_path' from source: unknown 41175 1727204643.90195: we have included files to process 41175 1727204643.90196: generating all_blocks data 41175 1727204643.90197: done generating all_blocks data 41175 1727204643.90200: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 41175 1727204643.90201: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 41175 1727204643.90203: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 41175 1727204643.90774: done processing included file 41175 1727204643.90776: iterating over new_blocks loaded from include file 41175 1727204643.90777: in VariableManager get_vars() 41175 1727204643.90802: done with get_vars() 41175 1727204643.90803: filtering new block on tags 41175 1727204643.90819: done filtering new block on tags 41175 1727204643.90821: in VariableManager get_vars() 41175 1727204643.90839: done with get_vars() 41175 1727204643.90840: filtering new block on tags 41175 1727204643.90855: done filtering new block on tags 41175 1727204643.90857: in VariableManager get_vars() 41175 1727204643.90874: done with get_vars() 41175 1727204643.90875: filtering new block on tags 41175 1727204643.90888: done filtering new block on tags 41175 1727204643.90892: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node3 41175 1727204643.90897: extending task lists for 
all hosts with included blocks 41175 1727204643.91745: done extending task lists 41175 1727204643.91747: done processing included files 41175 1727204643.91748: results queue empty 41175 1727204643.91748: checking for any_errors_fatal 41175 1727204643.91750: done checking for any_errors_fatal 41175 1727204643.91751: checking for max_fail_percentage 41175 1727204643.91752: done checking for max_fail_percentage 41175 1727204643.91752: checking to see if all hosts have failed and the running result is not ok 41175 1727204643.91753: done checking to see if all hosts have failed 41175 1727204643.91754: getting the remaining hosts for this loop 41175 1727204643.91755: done getting the remaining hosts for this loop 41175 1727204643.91757: getting the next task for host managed-node3 41175 1727204643.91760: done getting next task for host managed-node3 41175 1727204643.91762: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 41175 1727204643.91765: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204643.91774: getting variables 41175 1727204643.91775: in VariableManager get_vars() 41175 1727204643.91787: Calling all_inventory to load vars for managed-node3 41175 1727204643.91790: Calling groups_inventory to load vars for managed-node3 41175 1727204643.91792: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204643.91797: Calling all_plugins_play to load vars for managed-node3 41175 1727204643.91799: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204643.91801: Calling groups_plugins_play to load vars for managed-node3 41175 1727204643.91927: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204643.92107: done with get_vars() 41175 1727204643.92115: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 15:04:03 -0400 (0:00:00.036) 0:00:11.060 ***** 41175 1727204643.92172: entering _queue_task() for managed-node3/setup 41175 1727204643.92423: worker is 1 (out of 1 available) 41175 1727204643.92437: exiting _queue_task() for managed-node3/setup 41175 1727204643.92451: done queuing things up, now waiting for results queue to drain 41175 1727204643.92453: waiting for pending results... 
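Annotation: every task in this block first passes the role's version gate, logged above as `Evaluated conditional (ansible_distribution_major_version != '6'): True`. Note the fact is compared as a string, exactly as stored in `ansible_facts`. The same skip decision in plain Python (the function name is illustrative, not Ansible's):

```python
def passes_version_gate(facts: dict) -> bool:
    """Mirror the role's gating conditional:
    ansible_distribution_major_version != '6'
    The major version is a string in the facts, so compare as strings."""
    return facts.get("ansible_distribution_major_version") != "6"

# The managed nodes in this run report a modern major version, so the gate passes.
print(passes_version_gate({"ansible_distribution_major_version": "40"}))  # True
print(passes_version_gate({"ansible_distribution_major_version": "6"}))   # False
```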
41175 1727204643.92638: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 41175 1727204643.92753: in run() - task 12b410aa-8751-f070-39c4-000000000517 41175 1727204643.92765: variable 'ansible_search_path' from source: unknown 41175 1727204643.92770: variable 'ansible_search_path' from source: unknown 41175 1727204643.92808: calling self._execute() 41175 1727204643.92880: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204643.92887: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204643.92901: variable 'omit' from source: magic vars 41175 1727204643.93232: variable 'ansible_distribution_major_version' from source: facts 41175 1727204643.93244: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204643.93430: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41175 1727204643.95191: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41175 1727204643.95250: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41175 1727204643.95281: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41175 1727204643.95319: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41175 1727204643.95341: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41175 1727204643.95411: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204643.95452: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204643.95473: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204643.95509: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204643.95536: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204643.95573: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204643.95594: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204643.95615: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204643.95652: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204643.95665: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204643.95804: variable '__network_required_facts' from source: role 
'' defaults 41175 1727204643.95813: variable 'ansible_facts' from source: unknown 41175 1727204643.95899: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 41175 1727204643.95903: when evaluation is False, skipping this task 41175 1727204643.95906: _execute() done 41175 1727204643.95909: dumping result to json 41175 1727204643.95912: done dumping result, returning 41175 1727204643.95942: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12b410aa-8751-f070-39c4-000000000517] 41175 1727204643.95945: sending task result for task 12b410aa-8751-f070-39c4-000000000517 41175 1727204643.96027: done sending task result for task 12b410aa-8751-f070-39c4-000000000517 41175 1727204643.96030: WORKER PROCESS EXITING skipping: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 41175 1727204643.96094: no more pending results, returning what we have 41175 1727204643.96098: results queue empty 41175 1727204643.96099: checking for any_errors_fatal 41175 1727204643.96101: done checking for any_errors_fatal 41175 1727204643.96102: checking for max_fail_percentage 41175 1727204643.96104: done checking for max_fail_percentage 41175 1727204643.96105: checking to see if all hosts have failed and the running result is not ok 41175 1727204643.96106: done checking to see if all hosts have failed 41175 1727204643.96106: getting the remaining hosts for this loop 41175 1727204643.96108: done getting the remaining hosts for this loop 41175 1727204643.96114: getting the next task for host managed-node3 41175 1727204643.96126: done getting next task for host managed-node3 41175 1727204643.96130: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 41175 1727204643.96134: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41175 1727204643.96151: getting variables 41175 1727204643.96153: in VariableManager get_vars() 41175 1727204643.96199: Calling all_inventory to load vars for managed-node3 41175 1727204643.96203: Calling groups_inventory to load vars for managed-node3 41175 1727204643.96205: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204643.96218: Calling all_plugins_play to load vars for managed-node3 41175 1727204643.96222: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204643.96225: Calling groups_plugins_play to load vars for managed-node3 41175 1727204643.96440: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204643.96641: done with get_vars() 41175 1727204643.96651: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 15:04:03 -0400 (0:00:00.045) 0:00:11.106 ***** 41175 1727204643.96739: entering _queue_task() for managed-node3/stat 41175 1727204643.96969: worker is 1 (out of 1 
available) 41175 1727204643.96984: exiting _queue_task() for managed-node3/stat 41175 1727204643.97000: done queuing things up, now waiting for results queue to drain 41175 1727204643.97002: waiting for pending results... 41175 1727204643.97178: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 41175 1727204643.97305: in run() - task 12b410aa-8751-f070-39c4-000000000519 41175 1727204643.97319: variable 'ansible_search_path' from source: unknown 41175 1727204643.97324: variable 'ansible_search_path' from source: unknown 41175 1727204643.97365: calling self._execute() 41175 1727204643.97448: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204643.97453: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204643.97466: variable 'omit' from source: magic vars 41175 1727204643.97797: variable 'ansible_distribution_major_version' from source: facts 41175 1727204643.97805: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204643.97959: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41175 1727204643.98193: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41175 1727204643.98236: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41175 1727204643.98267: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41175 1727204643.98298: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41175 1727204643.98375: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41175 1727204643.98399: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41175 1727204643.98422: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204643.98448: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41175 1727204643.98528: variable '__network_is_ostree' from source: set_fact 41175 1727204643.98535: Evaluated conditional (not __network_is_ostree is defined): False 41175 1727204643.98538: when evaluation is False, skipping this task 41175 1727204643.98541: _execute() done 41175 1727204643.98548: dumping result to json 41175 1727204643.98550: done dumping result, returning 41175 1727204643.98561: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [12b410aa-8751-f070-39c4-000000000519] 41175 1727204643.98564: sending task result for task 12b410aa-8751-f070-39c4-000000000519 41175 1727204643.98654: done sending task result for task 12b410aa-8751-f070-39c4-000000000519 41175 1727204643.98657: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 41175 1727204643.98732: no more pending results, returning what we have 41175 1727204643.98737: results queue empty 41175 1727204643.98738: checking for any_errors_fatal 41175 1727204643.98746: done checking for any_errors_fatal 41175 1727204643.98747: checking for max_fail_percentage 41175 1727204643.98749: done checking for max_fail_percentage 41175 1727204643.98750: checking to see if all hosts have failed and the running result is not ok 41175 
1727204643.98752: done checking to see if all hosts have failed 41175 1727204643.98753: getting the remaining hosts for this loop 41175 1727204643.98754: done getting the remaining hosts for this loop 41175 1727204643.98759: getting the next task for host managed-node3 41175 1727204643.98766: done getting next task for host managed-node3 41175 1727204643.98772: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 41175 1727204643.98776: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204643.98793: getting variables 41175 1727204643.98795: in VariableManager get_vars() 41175 1727204643.98833: Calling all_inventory to load vars for managed-node3 41175 1727204643.98836: Calling groups_inventory to load vars for managed-node3 41175 1727204643.98839: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204643.98849: Calling all_plugins_play to load vars for managed-node3 41175 1727204643.98852: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204643.98855: Calling groups_plugins_play to load vars for managed-node3 41175 1727204643.99027: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204643.99222: done with get_vars() 41175 1727204643.99231: done getting variables 41175 1727204643.99277: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 15:04:03 -0400 (0:00:00.025) 0:00:11.131 ***** 41175 1727204643.99308: entering _queue_task() for managed-node3/set_fact 41175 1727204643.99518: worker is 1 (out of 1 available) 41175 1727204643.99533: exiting _queue_task() for managed-node3/set_fact 41175 1727204643.99546: done queuing things up, now waiting for results queue to drain 41175 1727204643.99548: waiting for pending results... 
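Annotation: the earlier `Ensure ansible_facts used by role are present` task was skipped because `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0` evaluated to False, meaning every fact the role needs was already cached. The same decision expressed as a Python set difference (the fact list here is an illustrative assumption; the real one lives in the role's defaults as `__network_required_facts`):

```python
# Illustrative subset of facts the role declares it needs.
required_facts = ["distribution", "distribution_major_version", "os_family"]

def needs_fact_gathering(ansible_facts: dict) -> bool:
    """True when at least one required fact is missing, which would
    trigger a targeted setup (fact-gathering) run instead of a skip."""
    missing = set(required_facts) - set(ansible_facts)
    return len(missing) > 0

cached = {"distribution": "Fedora", "distribution_major_version": "40",
          "os_family": "RedHat"}
print(needs_fact_gathering(cached))              # False -> task skipped, as logged
print(needs_fact_gathering({"os_family": "RedHat"}))  # True -> setup would run
```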
41175 1727204643.99721: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 41175 1727204643.99841: in run() - task 12b410aa-8751-f070-39c4-00000000051a 41175 1727204643.99853: variable 'ansible_search_path' from source: unknown 41175 1727204643.99857: variable 'ansible_search_path' from source: unknown 41175 1727204643.99887: calling self._execute() 41175 1727204643.99963: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204643.99970: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204643.99980: variable 'omit' from source: magic vars 41175 1727204644.00301: variable 'ansible_distribution_major_version' from source: facts 41175 1727204644.00313: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204644.00462: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41175 1727204644.00762: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41175 1727204644.00802: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41175 1727204644.00832: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41175 1727204644.00861: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41175 1727204644.00939: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41175 1727204644.00961: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41175 1727204644.00984: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204644.01012: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41175 1727204644.01085: variable '__network_is_ostree' from source: set_fact 41175 1727204644.01089: Evaluated conditional (not __network_is_ostree is defined): False 41175 1727204644.01099: when evaluation is False, skipping this task 41175 1727204644.01104: _execute() done 41175 1727204644.01107: dumping result to json 41175 1727204644.01109: done dumping result, returning 41175 1727204644.01196: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12b410aa-8751-f070-39c4-00000000051a] 41175 1727204644.01199: sending task result for task 12b410aa-8751-f070-39c4-00000000051a 41175 1727204644.01273: done sending task result for task 12b410aa-8751-f070-39c4-00000000051a 41175 1727204644.01276: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 41175 1727204644.01325: no more pending results, returning what we have 41175 1727204644.01329: results queue empty 41175 1727204644.01330: checking for any_errors_fatal 41175 1727204644.01336: done checking for any_errors_fatal 41175 1727204644.01337: checking for max_fail_percentage 41175 1727204644.01339: done checking for max_fail_percentage 41175 1727204644.01340: checking to see if all hosts have failed and the running result is not ok 41175 1727204644.01341: done checking to see if all hosts have failed 41175 1727204644.01342: getting the remaining hosts for this loop 41175 1727204644.01343: done getting the remaining hosts for this loop 
41175 1727204644.01347: getting the next task for host managed-node3 41175 1727204644.01356: done getting next task for host managed-node3 41175 1727204644.01360: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 41175 1727204644.01365: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204644.01379: getting variables 41175 1727204644.01381: in VariableManager get_vars() 41175 1727204644.01420: Calling all_inventory to load vars for managed-node3 41175 1727204644.01422: Calling groups_inventory to load vars for managed-node3 41175 1727204644.01424: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204644.01432: Calling all_plugins_play to load vars for managed-node3 41175 1727204644.01434: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204644.01436: Calling groups_plugins_play to load vars for managed-node3 41175 1727204644.01636: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204644.01829: done with get_vars() 41175 1727204644.01839: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 15:04:04 -0400 (0:00:00.026) 0:00:11.158 ***** 41175 1727204644.01913: entering _queue_task() for managed-node3/service_facts 41175 1727204644.01915: Creating lock for service_facts 41175 1727204644.02143: worker is 1 (out of 1 available) 41175 1727204644.02158: exiting _queue_task() for managed-node3/service_facts 41175 1727204644.02171: done queuing things up, now waiting for results queue to drain 41175 1727204644.02172: waiting for pending results... 
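Annotation: the `Check if system is ostree` / `Set flag to indicate system is ostree` pair above was skipped in both halves because `__network_is_ostree` was already cached by an earlier `set_fact`, making `not __network_is_ostree is defined` False. When the flag is absent, the stat-based check probes a marker file; `/run/ostree-booted` is the conventional marker for an rpm-ostree system, but the exact path is not visible in this log, so treat it as an assumption:

```python
import os

OSTREE_MARKER = "/run/ostree-booted"  # assumed marker path; not shown in the log

def is_ostree(cached: dict) -> bool:
    """Return the cached flag when present (the skipped-task case in the log);
    otherwise fall back to probing the marker file, mirroring the
    stat + set_fact pair in set_facts.yml."""
    if "__network_is_ostree" in cached:   # 'not ... is defined' evaluates False
        return bool(cached["__network_is_ostree"])
    return os.path.exists(OSTREE_MARKER)

print(is_ostree({"__network_is_ostree": False}))  # False, no filesystem probe
```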
41175 1727204644.02344: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which services are running 41175 1727204644.02451: in run() - task 12b410aa-8751-f070-39c4-00000000051c 41175 1727204644.02462: variable 'ansible_search_path' from source: unknown 41175 1727204644.02466: variable 'ansible_search_path' from source: unknown 41175 1727204644.02499: calling self._execute() 41175 1727204644.02575: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204644.02582: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204644.02592: variable 'omit' from source: magic vars 41175 1727204644.02912: variable 'ansible_distribution_major_version' from source: facts 41175 1727204644.02925: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204644.02931: variable 'omit' from source: magic vars 41175 1727204644.02997: variable 'omit' from source: magic vars 41175 1727204644.03026: variable 'omit' from source: magic vars 41175 1727204644.03060: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41175 1727204644.03099: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41175 1727204644.03117: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41175 1727204644.03136: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204644.03150: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204644.03184: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41175 1727204644.03187: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204644.03271: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed-node3' 41175 1727204644.03277: Set connection var ansible_shell_executable to /bin/sh 41175 1727204644.03287: Set connection var ansible_shell_type to sh 41175 1727204644.03291: Set connection var ansible_pipelining to False 41175 1727204644.03303: Set connection var ansible_timeout to 10 41175 1727204644.03309: Set connection var ansible_connection to ssh 41175 1727204644.03315: Set connection var ansible_module_compression to ZIP_DEFLATED 41175 1727204644.03338: variable 'ansible_shell_executable' from source: unknown 41175 1727204644.03342: variable 'ansible_connection' from source: unknown 41175 1727204644.03345: variable 'ansible_module_compression' from source: unknown 41175 1727204644.03347: variable 'ansible_shell_type' from source: unknown 41175 1727204644.03352: variable 'ansible_shell_executable' from source: unknown 41175 1727204644.03356: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204644.03361: variable 'ansible_pipelining' from source: unknown 41175 1727204644.03363: variable 'ansible_timeout' from source: unknown 41175 1727204644.03369: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204644.03539: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 41175 1727204644.03550: variable 'omit' from source: magic vars 41175 1727204644.03555: starting attempt loop 41175 1727204644.03559: running the handler 41175 1727204644.03572: _low_level_execute_command(): starting 41175 1727204644.03580: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41175 1727204644.04133: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 41175 1727204644.04137: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204644.04141: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 41175 1727204644.04143: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204644.04192: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204644.04195: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204644.04243: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204644.05987: stdout chunk (state=3): >>>/root <<< 41175 1727204644.06086: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204644.06148: stderr chunk (state=3): >>><<< 41175 1727204644.06152: stdout chunk (state=3): >>><<< 41175 1727204644.06174: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204644.06186: _low_level_execute_command(): starting 41175 1727204644.06194: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204644.061737-41828-169755318952548 `" && echo ansible-tmp-1727204644.061737-41828-169755318952548="` echo /root/.ansible/tmp/ansible-tmp-1727204644.061737-41828-169755318952548 `" ) && sleep 0' 41175 1727204644.06671: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204644.06675: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204644.06680: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 41175 1727204644.06691: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 41175 1727204644.06694: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204644.06744: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204644.06752: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204644.06754: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204644.06791: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204644.08773: stdout chunk (state=3): >>>ansible-tmp-1727204644.061737-41828-169755318952548=/root/.ansible/tmp/ansible-tmp-1727204644.061737-41828-169755318952548 <<< 41175 1727204644.08887: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204644.08941: stderr chunk (state=3): >>><<< 41175 1727204644.08945: stdout chunk (state=3): >>><<< 41175 1727204644.08965: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204644.061737-41828-169755318952548=/root/.ansible/tmp/ansible-tmp-1727204644.061737-41828-169755318952548 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204644.09010: variable 'ansible_module_compression' from source: unknown 41175 1727204644.09051: ANSIBALLZ: Using lock for service_facts 41175 1727204644.09056: ANSIBALLZ: Acquiring lock 41175 1727204644.09059: ANSIBALLZ: Lock acquired: 140088837722032 41175 1727204644.09061: ANSIBALLZ: Creating module 41175 1727204644.23897: ANSIBALLZ: Writing module into payload 41175 1727204644.23953: ANSIBALLZ: Writing module 41175 1727204644.23985: ANSIBALLZ: Renaming module 41175 1727204644.24002: ANSIBALLZ: Done creating module 41175 1727204644.24026: variable 'ansible_facts' from source: unknown 41175 1727204644.24116: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204644.061737-41828-169755318952548/AnsiballZ_service_facts.py 41175 1727204644.24376: Sending initial data 41175 1727204644.24386: Sent initial data (161 bytes) 41175 1727204644.25006: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204644.25039: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204644.25056: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204644.25078: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204644.25158: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204644.26882: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41175 1727204644.26943: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41175 1727204644.26977: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204644.061737-41828-169755318952548/AnsiballZ_service_facts.py" <<< 41175 1727204644.27005: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41175qa2aqh90/tmpszdyc324 /root/.ansible/tmp/ansible-tmp-1727204644.061737-41828-169755318952548/AnsiballZ_service_facts.py <<< 41175 1727204644.27038: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-41175qa2aqh90/tmpszdyc324" to remote "/root/.ansible/tmp/ansible-tmp-1727204644.061737-41828-169755318952548/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204644.061737-41828-169755318952548/AnsiballZ_service_facts.py" <<< 41175 1727204644.28182: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204644.28267: stderr chunk (state=3): >>><<< 41175 1727204644.28387: stdout chunk (state=3): >>><<< 41175 1727204644.28392: done transferring module to remote 41175 1727204644.28395: _low_level_execute_command(): starting 41175 1727204644.28398: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204644.061737-41828-169755318952548/ /root/.ansible/tmp/ansible-tmp-1727204644.061737-41828-169755318952548/AnsiballZ_service_facts.py && sleep 0' 41175 1727204644.29027: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 1727204644.29064: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204644.29076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204644.29184: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204644.29209: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204644.29228: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204644.29249: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204644.29318: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204644.31284: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204644.31303: stdout chunk (state=3): >>><<< 41175 1727204644.31319: stderr chunk (state=3): >>><<< 41175 1727204644.31340: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204644.31350: _low_level_execute_command(): starting 41175 1727204644.31360: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204644.061737-41828-169755318952548/AnsiballZ_service_facts.py && sleep 0' 41175 1727204644.31994: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 1727204644.32011: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204644.32027: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204644.32052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204644.32163: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 41175 1727204644.32179: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 
1727204644.32201: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204644.32275: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204646.35233: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"network": {"name": "network", "state": "running", "status": "enabled", "source": "sysv"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": 
"dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "import-state.service": {"name": "import-state.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": 
"initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "ip6tables.service": {"name": "ip6tables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "iptables.service": {"name": "iptables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "generated", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": 
"NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": 
"restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "<<< 41175 1727204646.35264: stdout chunk (state=3): >>>source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": 
"systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "syst<<< 41175 1727204646.35270: stdout chunk (state=3): >>>emd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, 
"blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": 
"dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "system<<< 41175 1727204646.35306: stdout chunk (state=3): >>>d"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": 
"grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "loadmodules.service": {"name": "loadmodules.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", 
"source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": 
"unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "s<<< 41175 1727204646.35314: stdout chunk (state=3): >>>tatic", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", 
"state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 41175 1727204646.36996: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
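The JSON streamed above is the fact payload returned by `ansible.builtin.service_facts`: a `services` mapping whose entries each carry `name`, `state`, `status`, and `source`. Inside a playbook this is normally consumed via `ansible_facts.services`; as a minimal sketch (sample entries copied from the captured output, function name `running_services` is my own), the same filtering can be expressed in plain Python:

```python
# Sketch: filter a service_facts-style "services" mapping for running units.
# Each entry mirrors the module output above:
#   {"name": ..., "state": ..., "status": ..., "source": ...}

def running_services(services: dict) -> list[str]:
    """Return the sorted names of services whose state is "running"."""
    return sorted(
        name
        for name, info in services.items()
        if info.get("state") == "running"
    )

# Tiny sample taken from entries in the captured stdout.
sample = {
    "sshd.service": {"name": "sshd.service", "state": "running",
                     "status": "enabled", "source": "systemd"},
    "firewalld.service": {"name": "firewalld.service", "state": "inactive",
                          "status": "disabled", "source": "systemd"},
    "systemd-journald.service": {"name": "systemd-journald.service",
                                 "state": "running", "status": "static",
                                 "source": "systemd"},
}

print(running_services(sample))  # ['sshd.service', 'systemd-journald.service']
```

The equivalent in a play would be a Jinja2 expression such as `ansible_facts.services | dict2items | selectattr('value.state', 'eq', 'running')`, but the Python form makes the shape of the data easier to see.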
<<< 41175 1727204646.37058: stderr chunk (state=3): >>><<< 41175 1727204646.37062: stdout chunk (state=3): >>><<< 41175 1727204646.37085: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"network": {"name": "network", "state": "running", "status": "enabled", "source": "sysv"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "import-state.service": {"name": "import-state.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"ip6tables.service": {"name": "ip6tables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "iptables.service": {"name": "iptables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "generated", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": 
"nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", 
"state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", 
"source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": 
"running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": 
"chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": 
"dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "loadmodules.service": {"name": "loadmodules.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": 
{"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": 
{"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", 
"status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 41175 1727204646.37704: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204644.061737-41828-169755318952548/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41175 1727204646.37715: _low_level_execute_command(): starting 41175 1727204646.37723: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204644.061737-41828-169755318952548/ > /dev/null 2>&1 && sleep 0' 41175 1727204646.38219: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204646.38223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 41175 1727204646.38226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address <<< 41175 1727204646.38229: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 41175 1727204646.38231: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204646.38285: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204646.38292: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204646.38295: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204646.38332: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204646.40306: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204646.40360: stderr chunk (state=3): >>><<< 41175 1727204646.40364: stdout chunk (state=3): >>><<< 41175 1727204646.40377: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204646.40388: handler run complete 41175 1727204646.40561: variable 'ansible_facts' from source: unknown 41175 1727204646.40700: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204646.41143: variable 'ansible_facts' from source: unknown 41175 1727204646.41261: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204646.41461: attempt loop complete, returning result 41175 1727204646.41467: _execute() done 41175 1727204646.41471: dumping result to json 41175 1727204646.41523: done dumping result, returning 41175 1727204646.41533: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which services are running [12b410aa-8751-f070-39c4-00000000051c] 41175 1727204646.41538: sending task result for task 12b410aa-8751-f070-39c4-00000000051c ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 41175 1727204646.43031: no more pending results, returning what we have 41175 1727204646.43035: results queue empty 41175 1727204646.43036: checking for any_errors_fatal 41175 1727204646.43041: done checking for any_errors_fatal 41175 1727204646.43042: checking for max_fail_percentage 41175 1727204646.43043: done checking for max_fail_percentage 41175 1727204646.43044: checking to see if all hosts have failed and the running result is not ok 41175 1727204646.43045: done checking to see if all hosts have failed 41175 1727204646.43046: getting the remaining hosts for this loop 41175 1727204646.43047: done getting the remaining hosts for this loop 41175 1727204646.43052: getting the next task for host managed-node3 41175 
1727204646.43058: done getting next task for host managed-node3 41175 1727204646.43062: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 41175 1727204646.43066: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204646.43078: done sending task result for task 12b410aa-8751-f070-39c4-00000000051c 41175 1727204646.43082: WORKER PROCESS EXITING 41175 1727204646.43088: getting variables 41175 1727204646.43089: in VariableManager get_vars() 41175 1727204646.43120: Calling all_inventory to load vars for managed-node3 41175 1727204646.43122: Calling groups_inventory to load vars for managed-node3 41175 1727204646.43124: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204646.43132: Calling all_plugins_play to load vars for managed-node3 41175 1727204646.43134: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204646.43136: Calling groups_plugins_play to load vars for managed-node3 41175 1727204646.43484: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204646.43949: done with get_vars() 41175 1727204646.43962: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 15:04:06 -0400 (0:00:02.421) 0:00:13.579 ***** 41175 1727204646.44048: entering _queue_task() for managed-node3/package_facts 41175 1727204646.44050: Creating lock for package_facts 41175 1727204646.44286: worker is 1 (out of 1 available) 41175 1727204646.44302: exiting _queue_task() for managed-node3/package_facts 41175 1727204646.44314: done queuing things up, now waiting for results queue to drain 41175 1727204646.44316: waiting for pending results... 
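The `service_facts` result censored above (its full structure is visible earlier in this log, before `no_log` kicked in) is a mapping from unit name to a dict with `name`, `state`, `status`, and `source` keys. A minimal sketch of how that mapping could be consumed downstream, using a hypothetical excerpt of the data rather than the full result:

```python
# Hypothetical excerpt of the ansible_facts.services mapping returned by the
# service_facts module (same per-service shape as shown in the log above).
services = {
    "firewalld.service": {"name": "firewalld.service", "state": "inactive",
                          "status": "disabled", "source": "systemd"},
    "dbus.service": {"name": "dbus.service", "state": "active",
                     "status": "alias", "source": "systemd"},
    "getty@.service": {"name": "getty@.service", "state": "unknown",
                       "status": "enabled", "source": "systemd"},
}

def running(svcs):
    """Return the sorted names of services whose state is 'active'."""
    return sorted(name for name, svc in svcs.items()
                  if svc["state"] == "active")

print(running(services))  # -> ['dbus.service']
```

In a playbook the same filter is typically expressed in Jinja2 (e.g. `selectattr('value.state', 'eq', 'active')` over `ansible_facts.services`); the Python form above just makes the data shape explicit.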
41175 1727204646.44503: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 41175 1727204646.44614: in run() - task 12b410aa-8751-f070-39c4-00000000051d 41175 1727204646.44627: variable 'ansible_search_path' from source: unknown 41175 1727204646.44630: variable 'ansible_search_path' from source: unknown 41175 1727204646.44666: calling self._execute() 41175 1727204646.44740: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204646.44748: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204646.44760: variable 'omit' from source: magic vars 41175 1727204646.45071: variable 'ansible_distribution_major_version' from source: facts 41175 1727204646.45082: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204646.45091: variable 'omit' from source: magic vars 41175 1727204646.45156: variable 'omit' from source: magic vars 41175 1727204646.45186: variable 'omit' from source: magic vars 41175 1727204646.45225: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41175 1727204646.45255: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41175 1727204646.45273: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41175 1727204646.45290: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204646.45301: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204646.45333: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41175 1727204646.45337: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204646.45339: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed-node3' 41175 1727204646.45427: Set connection var ansible_shell_executable to /bin/sh 41175 1727204646.45433: Set connection var ansible_shell_type to sh 41175 1727204646.45436: Set connection var ansible_pipelining to False 41175 1727204646.45444: Set connection var ansible_timeout to 10 41175 1727204646.45451: Set connection var ansible_connection to ssh 41175 1727204646.45458: Set connection var ansible_module_compression to ZIP_DEFLATED 41175 1727204646.45477: variable 'ansible_shell_executable' from source: unknown 41175 1727204646.45480: variable 'ansible_connection' from source: unknown 41175 1727204646.45483: variable 'ansible_module_compression' from source: unknown 41175 1727204646.45488: variable 'ansible_shell_type' from source: unknown 41175 1727204646.45493: variable 'ansible_shell_executable' from source: unknown 41175 1727204646.45496: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204646.45502: variable 'ansible_pipelining' from source: unknown 41175 1727204646.45505: variable 'ansible_timeout' from source: unknown 41175 1727204646.45510: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204646.45679: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 41175 1727204646.45690: variable 'omit' from source: magic vars 41175 1727204646.45697: starting attempt loop 41175 1727204646.45700: running the handler 41175 1727204646.45714: _low_level_execute_command(): starting 41175 1727204646.45722: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41175 1727204646.46266: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 41175 1727204646.46269: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204646.46272: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204646.46275: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204646.46331: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204646.46335: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204646.46382: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204646.48149: stdout chunk (state=3): >>>/root <<< 41175 1727204646.48261: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204646.48317: stderr chunk (state=3): >>><<< 41175 1727204646.48322: stdout chunk (state=3): >>><<< 41175 1727204646.48344: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204646.48355: _low_level_execute_command(): starting 41175 1727204646.48361: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204646.483422-41880-164978061628562 `" && echo ansible-tmp-1727204646.483422-41880-164978061628562="` echo /root/.ansible/tmp/ansible-tmp-1727204646.483422-41880-164978061628562 `" ) && sleep 0' 41175 1727204646.48816: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204646.48823: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204646.48826: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 41175 1727204646.48835: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204646.48881: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204646.48886: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204646.48929: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204646.50943: stdout chunk (state=3): >>>ansible-tmp-1727204646.483422-41880-164978061628562=/root/.ansible/tmp/ansible-tmp-1727204646.483422-41880-164978061628562 <<< 41175 1727204646.51060: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204646.51113: stderr chunk (state=3): >>><<< 41175 1727204646.51119: stdout chunk (state=3): >>><<< 41175 1727204646.51133: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204646.483422-41880-164978061628562=/root/.ansible/tmp/ansible-tmp-1727204646.483422-41880-164978061628562 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204646.51173: variable 'ansible_module_compression' from source: unknown 41175 1727204646.51214: ANSIBALLZ: Using lock for package_facts 41175 1727204646.51221: ANSIBALLZ: Acquiring lock 41175 1727204646.51224: ANSIBALLZ: Lock acquired: 140088839174816 41175 1727204646.51226: ANSIBALLZ: Creating module 41175 1727204646.77897: ANSIBALLZ: Writing module into payload 41175 1727204646.77939: ANSIBALLZ: Writing module 41175 1727204646.77979: ANSIBALLZ: Renaming module 41175 1727204646.77996: ANSIBALLZ: Done creating module 41175 1727204646.78040: variable 'ansible_facts' from source: unknown 41175 1727204646.78254: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204646.483422-41880-164978061628562/AnsiballZ_package_facts.py 41175 1727204646.78405: Sending initial data 41175 1727204646.78418: Sent initial data (161 bytes) 41175 1727204646.78873: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204646.78879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204646.78917: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 41175 1727204646.78921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 41175 1727204646.78924: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 41175 1727204646.78927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204646.78983: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204646.78986: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204646.79037: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204646.80959: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41175 1727204646.80986: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41175 1727204646.81041: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41175qa2aqh90/tmpw2a8e59q /root/.ansible/tmp/ansible-tmp-1727204646.483422-41880-164978061628562/AnsiballZ_package_facts.py <<< 41175 1727204646.81045: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204646.483422-41880-164978061628562/AnsiballZ_package_facts.py" <<< 41175 1727204646.81064: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-41175qa2aqh90/tmpw2a8e59q" to remote "/root/.ansible/tmp/ansible-tmp-1727204646.483422-41880-164978061628562/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204646.483422-41880-164978061628562/AnsiballZ_package_facts.py" <<< 41175 1727204646.83351: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204646.83366: stderr chunk (state=3): >>><<< 41175 1727204646.83381: stdout chunk (state=3): >>><<< 41175 1727204646.83525: done transferring module to remote 41175 1727204646.83545: _low_level_execute_command(): starting 41175 1727204646.83557: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204646.483422-41880-164978061628562/ /root/.ansible/tmp/ansible-tmp-1727204646.483422-41880-164978061628562/AnsiballZ_package_facts.py && sleep 0' 41175 1727204646.84313: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204646.84358: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204646.84376: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204646.84400: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204646.84471: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204646.86453: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204646.86465: stdout chunk (state=3): >>><<< 41175 1727204646.86478: stderr chunk (state=3): >>><<< 41175 1727204646.86503: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204646.86597: _low_level_execute_command(): starting 41175 1727204646.86600: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204646.483422-41880-164978061628562/AnsiballZ_package_facts.py && sleep 0' 41175 1727204646.87260: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 1727204646.87398: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204646.87402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204646.87425: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204646.87445: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204646.87469: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204646.87734: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 41175 1727204647.51729: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "a<<< 41175 1727204647.51794: stdout chunk (state=3): >>>rch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", 
"version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", 
"version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": 
"8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", 
"release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "relea<<< 41175 1727204647.51832: stdout chunk (state=3): >>>se": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", 
"release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"less": [{"name": "less", "version": "633", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, <<< 41175 1727204647.51902: stdout chunk (state=3): >>>"arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", 
"release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": 
"kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", 
"release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": 
"3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": 
"NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.<<< 41175 1727204647.51934: stdout chunk (state=3): >>>fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": 
"libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": 
"0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", 
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": 
"4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": 
"perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", 
"release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", <<< 41175 1727204647.51983: stdout chunk (state=3): >>>"source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, 
"arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": 
"perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": 
"0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": 
"1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": 
[{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": 
"1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", <<< 41175 1727204647.51992: stdout chunk (state=3): >>>"source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema",
"version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": 
"3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, <<< 41175 1727204647.52020: stdout chunk (state=3): >>>"arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source":
"rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chkconfig": [{"name": "chkconfig", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts": [{"name": "initscripts", "version": "10.20", 
"release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "network-scripts": [{"name": "network-scripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 41175 1727204647.53968: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 41175 1727204647.54085: stderr chunk (state=3): >>><<< 41175 1727204647.54088: stdout chunk (state=3): >>><<< 41175 1727204647.54111: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": 
"39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": 
[{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", 
"release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", 
"version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": 
"libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": 
[{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": 
"lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": 
"libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", 
"release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": 
"python3", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", 
"version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], 
"initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": 
"libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": 
"firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": 
"man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": 
"perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", 
"release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], 
"perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": 
"500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", 
"release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": 
"2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "100.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": 
"0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", 
"version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": 
[{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chkconfig": [{"name": "chkconfig", "version": "1.26", "release": "1.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts": [{"name": "initscripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "network-scripts": [{"name": "network-scripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
41175 1727204647.59098: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204646.483422-41880-164978061628562/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41175 1727204647.59102: _low_level_execute_command(): starting 41175 1727204647.59105: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204646.483422-41880-164978061628562/ > /dev/null 2>&1 && sleep 0' 41175 1727204647.59672: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 1727204647.59724: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204647.59727: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204647.59730: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204647.59733: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204647.59735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 41175 1727204647.59737: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204647.59813: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204647.59821: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204647.59824: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204647.59874: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204647.61965: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204647.61969: stdout chunk (state=3): >>><<< 41175 1727204647.61971: stderr chunk (state=3): >>><<< 41175 1727204647.61975: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
41175 1727204647.61977: handler run complete
41175 1727204647.67901: variable 'ansible_facts' from source: unknown
41175 1727204647.69098: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41175 1727204647.71296: variable 'ansible_facts' from source: unknown
41175 1727204647.72067: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41175 1727204647.73434: attempt loop complete, returning result
41175 1727204647.73459: _execute() done
41175 1727204647.73463: dumping result to json
41175 1727204647.73711: done dumping result, returning
41175 1727204647.73723: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [12b410aa-8751-f070-39c4-00000000051d]
41175 1727204647.73726: sending task result for task 12b410aa-8751-f070-39c4-00000000051d
41175 1727204647.76116: done sending task result for task 12b410aa-8751-f070-39c4-00000000051d
41175 1727204647.76123: WORKER PROCESS EXITING
ok: [managed-node3] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
41175 1727204647.76226: no more pending results, returning what we have
41175 1727204647.76230: results queue empty
41175 1727204647.76231: checking for any_errors_fatal
41175 1727204647.76236: done checking for any_errors_fatal
41175 1727204647.76237: checking for max_fail_percentage
41175 1727204647.76239: done checking for max_fail_percentage
41175 1727204647.76240: checking to see if all hosts have failed and the running result is not ok
41175 1727204647.76241: done checking to see if all hosts have failed
41175 1727204647.76242: getting the remaining hosts for this loop
41175 1727204647.76244: done getting the remaining hosts for this loop
41175 1727204647.76248: getting the next task for host managed-node3
41175 1727204647.76256: done getting next task for host managed-node3
41175 1727204647.76259: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider
41175 1727204647.76262: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
41175 1727204647.76273: getting variables
41175 1727204647.76275: in VariableManager get_vars()
41175 1727204647.76316: Calling all_inventory to load vars for managed-node3
41175 1727204647.76322: Calling groups_inventory to load vars for managed-node3
41175 1727204647.76325: Calling all_plugins_inventory to load vars for managed-node3
41175 1727204647.76336: Calling all_plugins_play to load vars for managed-node3
41175 1727204647.76340: Calling groups_plugins_inventory to load vars for managed-node3
41175 1727204647.76344: Calling groups_plugins_play to load vars for managed-node3
41175 1727204647.77711: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41175 1727204647.79312: done with get_vars()
41175 1727204647.79338: done getting variables
41175 1727204647.79391: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Print network provider] **************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7
Tuesday 24 September 2024 15:04:07 -0400 (0:00:01.353) 0:00:14.933 *****
41175 1727204647.79424: entering _queue_task() for managed-node3/debug
41175 1727204647.79676: worker is 1 (out of 1 available)
41175 1727204647.79694: exiting _queue_task() for managed-node3/debug
41175 1727204647.79707: done queuing things up, now waiting for results queue to drain
41175 1727204647.79709: waiting for pending results...
41175 1727204647.79898: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider
41175 1727204647.80002: in run() - task 12b410aa-8751-f070-39c4-000000000017
41175 1727204647.80015: variable 'ansible_search_path' from source: unknown
41175 1727204647.80022: variable 'ansible_search_path' from source: unknown
41175 1727204647.80058: calling self._execute()
41175 1727204647.80130: variable 'ansible_host' from source: host vars for 'managed-node3'
41175 1727204647.80137: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
41175 1727204647.80148: variable 'omit' from source: magic vars
41175 1727204647.80473: variable 'ansible_distribution_major_version' from source: facts
41175 1727204647.80485: Evaluated conditional (ansible_distribution_major_version != '6'): True
41175 1727204647.80495: variable 'omit' from source: magic vars
41175 1727204647.80543: variable 'omit' from source: magic vars
41175 1727204647.80628: variable 'network_provider' from source: set_fact
41175 1727204647.80643: variable 'omit' from source: magic vars
41175 1727204647.80677: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
41175 1727204647.80712: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
41175 1727204647.80731: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
41175 1727204647.80747: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
41175 1727204647.80759: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
41175 1727204647.80786: variable 'inventory_hostname' from source: host vars for 'managed-node3'
41175 1727204647.80791: variable 'ansible_host' from source: host vars for 'managed-node3'
41175 1727204647.80796: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
41175 1727204647.80882: Set connection var ansible_shell_executable to /bin/sh
41175 1727204647.80886: Set connection var ansible_shell_type to sh
41175 1727204647.80892: Set connection var ansible_pipelining to False
41175 1727204647.80902: Set connection var ansible_timeout to 10
41175 1727204647.80910: Set connection var ansible_connection to ssh
41175 1727204647.80915: Set connection var ansible_module_compression to ZIP_DEFLATED
41175 1727204647.80939: variable 'ansible_shell_executable' from source: unknown
41175 1727204647.80943: variable 'ansible_connection' from source: unknown
41175 1727204647.80946: variable 'ansible_module_compression' from source: unknown
41175 1727204647.80948: variable 'ansible_shell_type' from source: unknown
41175 1727204647.80951: variable 'ansible_shell_executable' from source: unknown
41175 1727204647.80957: variable 'ansible_host' from source: host vars for 'managed-node3'
41175 1727204647.80959: variable 'ansible_pipelining' from source: unknown
41175 1727204647.80964: variable 'ansible_timeout' from source: unknown
41175 1727204647.80970: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
41175 1727204647.81088: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
41175 1727204647.81104: variable 'omit' from source: magic vars
41175 1727204647.81110: starting attempt loop
41175 1727204647.81113: running the handler
41175 1727204647.81167: handler run complete
41175 1727204647.81182: attempt loop complete, returning result
41175 1727204647.81186: _execute() done
41175 1727204647.81188: dumping result to json
41175 1727204647.81195: done dumping result, returning
41175 1727204647.81203: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider [12b410aa-8751-f070-39c4-000000000017]
41175 1727204647.81209: sending task result for task 12b410aa-8751-f070-39c4-000000000017
41175 1727204647.81301: done sending task result for task 12b410aa-8751-f070-39c4-000000000017
41175 1727204647.81304: WORKER PROCESS EXITING
ok: [managed-node3] => {}

MSG:

Using network provider: nm
41175 1727204647.81365: no more pending results, returning what we have
41175 1727204647.81369: results queue empty
41175 1727204647.81370: checking for any_errors_fatal
41175 1727204647.81381: done checking for any_errors_fatal
41175 1727204647.81382: checking for max_fail_percentage
41175 1727204647.81383: done checking for max_fail_percentage
41175 1727204647.81384: checking to see if all hosts have failed and the running result is not ok
41175 1727204647.81386: done checking to see if all hosts have failed
41175 1727204647.81386: getting the remaining hosts for this loop
41175 1727204647.81388: done getting the remaining hosts for this loop
41175 1727204647.81401: getting the next task for host managed-node3
41175 1727204647.81408: done getting next task for host managed-node3
41175 1727204647.81413: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
41175 1727204647.81416: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
41175 1727204647.81430: getting variables
41175 1727204647.81431: in VariableManager get_vars()
41175 1727204647.81470: Calling all_inventory to load vars for managed-node3
41175 1727204647.81473: Calling groups_inventory to load vars for managed-node3
41175 1727204647.81476: Calling all_plugins_inventory to load vars for managed-node3
41175 1727204647.81485: Calling all_plugins_play to load vars for managed-node3
41175 1727204647.81488: Calling groups_plugins_inventory to load vars for managed-node3
41175 1727204647.81494: Calling groups_plugins_play to load vars for managed-node3
41175 1727204647.82777: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41175 1727204647.84377: done with get_vars()
41175 1727204647.84406: done getting variables
41175 1727204647.84461: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] ***
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11
Tuesday 24 September 2024 15:04:07 -0400 (0:00:00.050) 0:00:14.983 *****
41175 1727204647.84491: entering _queue_task() for managed-node3/fail
41175 1727204647.84750: worker is 1 (out of 1 available)
41175 1727204647.84766: exiting _queue_task() for managed-node3/fail
41175 1727204647.84778: done queuing things up, now waiting for results queue to drain
41175 1727204647.84781: waiting for pending results...
41175 1727204647.84969: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
41175 1727204647.85085: in run() - task 12b410aa-8751-f070-39c4-000000000018
41175 1727204647.85099: variable 'ansible_search_path' from source: unknown
41175 1727204647.85103: variable 'ansible_search_path' from source: unknown
41175 1727204647.85139: calling self._execute()
41175 1727204647.85213: variable 'ansible_host' from source: host vars for 'managed-node3'
41175 1727204647.85225: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
41175 1727204647.85233: variable 'omit' from source: magic vars
41175 1727204647.85539: variable 'ansible_distribution_major_version' from source: facts
41175 1727204647.85552: Evaluated conditional (ansible_distribution_major_version != '6'): True
41175 1727204647.85658: variable 'network_state' from source: role '' defaults
41175 1727204647.85667: Evaluated conditional (network_state != {}): False
41175 1727204647.85672: when evaluation is False, skipping this task
41175 1727204647.85675: _execute() done
41175 1727204647.85678: dumping result to json
41175 1727204647.85686: done dumping result, returning
41175 1727204647.85693: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12b410aa-8751-f070-39c4-000000000018]
41175 1727204647.85700: sending task result for task 12b410aa-8751-f070-39c4-000000000018
41175 1727204647.85801: done sending task result for task 12b410aa-8751-f070-39c4-000000000018
41175 1727204647.85804: WORKER PROCESS EXITING
skipping: [managed-node3] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
41175 1727204647.85857: no more pending results, returning what we have
41175 1727204647.85862: results queue empty
41175 1727204647.85864: checking for any_errors_fatal
41175 1727204647.85870: done checking for any_errors_fatal
41175 1727204647.85871: checking for max_fail_percentage
41175 1727204647.85872: done checking for max_fail_percentage
41175 1727204647.85873: checking to see if all hosts have failed and the running result is not ok
41175 1727204647.85874: done checking to see if all hosts have failed
41175 1727204647.85875: getting the remaining hosts for this loop
41175 1727204647.85877: done getting the remaining hosts for this loop
41175 1727204647.85882: getting the next task for host managed-node3
41175 1727204647.85891: done getting next task for host managed-node3
41175 1727204647.85896: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8
41175 1727204647.85899: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
41175 1727204647.85914: getting variables
41175 1727204647.85916: in VariableManager get_vars()
41175 1727204647.85957: Calling all_inventory to load vars for managed-node3
41175 1727204647.85961: Calling groups_inventory to load vars for managed-node3
41175 1727204647.85963: Calling all_plugins_inventory to load vars for managed-node3
41175 1727204647.85974: Calling all_plugins_play to load vars for managed-node3
41175 1727204647.85977: Calling groups_plugins_inventory to load vars for managed-node3
41175 1727204647.85980: Calling groups_plugins_play to load vars for managed-node3
41175 1727204647.87279: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41175 1727204647.88876: done with get_vars()
41175 1727204647.88901: done getting variables
41175 1727204647.88955: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] ***
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18
Tuesday 24 September 2024 15:04:07 -0400 (0:00:00.044) 0:00:15.028 *****
41175 1727204647.88981: entering _queue_task() for managed-node3/fail
41175 1727204647.89230: worker is 1 (out of 1 available)
41175 1727204647.89244: exiting _queue_task() for managed-node3/fail
41175 1727204647.89257: done queuing things up, now waiting for results queue to drain
41175 1727204647.89259: waiting for pending results...
41175 1727204647.89446: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8
41175 1727204647.89553: in run() - task 12b410aa-8751-f070-39c4-000000000019
41175 1727204647.89566: variable 'ansible_search_path' from source: unknown
41175 1727204647.89570: variable 'ansible_search_path' from source: unknown
41175 1727204647.89607: calling self._execute()
41175 1727204647.89682: variable 'ansible_host' from source: host vars for 'managed-node3'
41175 1727204647.89688: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
41175 1727204647.89701: variable 'omit' from source: magic vars
41175 1727204647.90020: variable 'ansible_distribution_major_version' from source: facts
41175 1727204647.90031: Evaluated conditional (ansible_distribution_major_version != '6'): True
41175 1727204647.90135: variable 'network_state' from source: role '' defaults
41175 1727204647.90149: Evaluated conditional (network_state != {}): False
41175 1727204647.90154: when evaluation is False, skipping this task
41175 1727204647.90157: _execute() done
41175 1727204647.90160: dumping result to json
41175 1727204647.90163: done dumping result, returning
41175 1727204647.90171: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [12b410aa-8751-f070-39c4-000000000019]
41175 1727204647.90177: sending task result for task 12b410aa-8751-f070-39c4-000000000019
41175 1727204647.90275: done sending task result for task 12b410aa-8751-f070-39c4-000000000019
41175 1727204647.90279: WORKER PROCESS EXITING
skipping: [managed-node3] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
41175 1727204647.90333: no more pending results, returning what we have
41175 1727204647.90337: results queue empty
41175 1727204647.90339: checking for any_errors_fatal
41175 1727204647.90351: done checking for any_errors_fatal
41175 1727204647.90352: checking for max_fail_percentage
41175 1727204647.90354: done checking for max_fail_percentage
41175 1727204647.90355: checking to see if all hosts have failed and the running result is not ok
41175 1727204647.90356: done checking to see if all hosts have failed
41175 1727204647.90357: getting the remaining hosts for this loop
41175 1727204647.90359: done getting the remaining hosts for this loop
41175 1727204647.90364: getting the next task for host managed-node3
41175 1727204647.90371: done getting next task for host managed-node3
41175 1727204647.90374: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
41175 1727204647.90377: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
41175 1727204647.90395: getting variables
41175 1727204647.90397: in VariableManager get_vars()
41175 1727204647.90438: Calling all_inventory to load vars for managed-node3
41175 1727204647.90441: Calling groups_inventory to load vars for managed-node3
41175 1727204647.90443: Calling all_plugins_inventory to load vars for managed-node3
41175 1727204647.90454: Calling all_plugins_play to load vars for managed-node3
41175 1727204647.90457: Calling groups_plugins_inventory to load vars for managed-node3
41175 1727204647.90461: Calling groups_plugins_play to load vars for managed-node3
41175 1727204647.91675: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41175 1727204647.93274: done with get_vars()
41175 1727204647.93305: done getting variables
41175 1727204647.93357: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] ***
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25
Tuesday 24 September 2024 15:04:07 -0400 (0:00:00.044) 0:00:15.072 *****
41175 1727204647.93385: entering _queue_task() for managed-node3/fail
41175 1727204647.93644: worker is 1 (out of 1 available)
41175 1727204647.93659: exiting _queue_task() for managed-node3/fail
41175 1727204647.93673: done queuing things up, now waiting for results queue to drain
41175 1727204647.93675: waiting for pending results...
41175 1727204647.93868: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
41175 1727204647.93983: in run() - task 12b410aa-8751-f070-39c4-00000000001a
41175 1727204647.93999: variable 'ansible_search_path' from source: unknown
41175 1727204647.94003: variable 'ansible_search_path' from source: unknown
41175 1727204647.94040: calling self._execute()
41175 1727204647.94121: variable 'ansible_host' from source: host vars for 'managed-node3'
41175 1727204647.94125: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
41175 1727204647.94139: variable 'omit' from source: magic vars
41175 1727204647.94443: variable 'ansible_distribution_major_version' from source: facts
41175 1727204647.94457: Evaluated conditional (ansible_distribution_major_version != '6'): True
41175 1727204647.94613: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
41175 1727204647.96600: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
41175 1727204647.96654: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
41175 1727204647.96697: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
41175 1727204647.96729: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
41175 1727204647.96753: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
41175 1727204647.96825: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
41175 1727204647.96849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
41175 1727204647.96873: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
41175 1727204647.96911: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
41175 1727204647.96926: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
41175 1727204647.97004: variable 'ansible_distribution_major_version' from source: facts
41175 1727204647.97018: Evaluated conditional (ansible_distribution_major_version | int > 9): True
41175 1727204647.97116: variable 'ansible_distribution' from source: facts
41175 1727204647.97123: variable '__network_rh_distros' from source: role '' defaults
41175 1727204647.97132: Evaluated conditional (ansible_distribution in __network_rh_distros): False
41175 1727204647.97135: when evaluation is False, skipping this task
41175 1727204647.97139: _execute() done
41175 1727204647.97144: dumping result to json
41175 1727204647.97147: done dumping result, returning
41175 1727204647.97156: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12b410aa-8751-f070-39c4-00000000001a]
41175 1727204647.97162: sending task result for task 12b410aa-8751-f070-39c4-00000000001a
41175 1727204647.97257: done sending task result for task 12b410aa-8751-f070-39c4-00000000001a
41175 1727204647.97260: WORKER PROCESS EXITING
skipping: [managed-node3] => {
    "changed": false,
    "false_condition": "ansible_distribution in __network_rh_distros",
    "skip_reason": "Conditional result was False"
}
41175 1727204647.97337: no more pending results, returning what we have
41175 1727204647.97341: results queue empty
41175 1727204647.97342: checking for any_errors_fatal
41175 1727204647.97348: done checking for any_errors_fatal
41175 1727204647.97349: checking for max_fail_percentage
41175 1727204647.97351: done checking for max_fail_percentage
41175 1727204647.97352: checking to see if all hosts have failed and the running result is not ok
41175 1727204647.97353: done checking to see if all hosts have failed
41175 1727204647.97354: getting the remaining hosts for this loop
41175 1727204647.97356: done getting the remaining hosts for this loop
41175 1727204647.97361: getting the next task for host managed-node3
41175 1727204647.97368: done getting next task for host managed-node3
41175 1727204647.97380: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
41175 1727204647.97383: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
41175 1727204647.97399: getting variables
41175 1727204647.97401: in VariableManager get_vars()
41175 1727204647.97442: Calling all_inventory to load vars for managed-node3
41175 1727204647.97445: Calling groups_inventory to load vars for managed-node3
41175 1727204647.97447: Calling all_plugins_inventory to load vars for managed-node3
41175 1727204647.97457: Calling all_plugins_play to load vars for managed-node3
41175 1727204647.97460: Calling groups_plugins_inventory to load vars for managed-node3
41175 1727204647.97463: Calling groups_plugins_play to load vars for managed-node3
41175 1727204647.98822: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41175 1727204648.01037: done with get_vars()
41175 1727204648.01063: done getting variables
41175 1727204648.01151: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] ***
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36
Tuesday 24 September 2024 15:04:08 -0400 (0:00:00.077) 0:00:15.150 *****
41175 1727204648.01179: entering _queue_task() for managed-node3/dnf
41175 1727204648.01427: worker is 1 (out of 1 available)
41175 1727204648.01443: exiting _queue_task() for managed-node3/dnf
41175 1727204648.01455: done queuing things up, now waiting for results queue to drain
41175 1727204648.01457: waiting for pending results...
41175 1727204648.01648: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
41175 1727204648.01753: in run() - task 12b410aa-8751-f070-39c4-00000000001b
41175 1727204648.01766: variable 'ansible_search_path' from source: unknown
41175 1727204648.01770: variable 'ansible_search_path' from source: unknown
41175 1727204648.01808: calling self._execute()
41175 1727204648.01878: variable 'ansible_host' from source: host vars for 'managed-node3'
41175 1727204648.01886: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
41175 1727204648.01898: variable 'omit' from source: magic vars
41175 1727204648.02212: variable 'ansible_distribution_major_version' from source: facts
41175 1727204648.02224: Evaluated conditional (ansible_distribution_major_version != '6'): True
41175 1727204648.02401: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
41175 1727204648.04138: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
41175 1727204648.04205: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
41175 1727204648.04240: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
41175 1727204648.04269: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
41175 1727204648.04298: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
41175 1727204648.04367: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
41175 1727204648.04392: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
41175 1727204648.04415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
41175 1727204648.04451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
41175 1727204648.04464: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
41175 1727204648.04562: variable 'ansible_distribution' from source: facts
41175 1727204648.04566: variable 'ansible_distribution_major_version' from source: facts
41175 1727204648.04573: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True
41175 1727204648.04669: variable '__network_wireless_connections_defined' from source: role '' defaults
41175 1727204648.04782: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
41175 1727204648.04805: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
41175 1727204648.04826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
41175 1727204648.04865: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
41175 1727204648.04874: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
41175 1727204648.04910: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
41175 1727204648.04931: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
41175 1727204648.04950: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
41175 1727204648.04985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
41175 1727204648.05000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
41175 1727204648.05034: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
41175 1727204648.05053: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
41175 1727204648.05082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
41175 1727204648.05110: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
41175 1727204648.05124: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
41175 1727204648.05255: variable 'network_connections' from source: task vars
41175 1727204648.05266: variable 'interface' from source: set_fact
41175 1727204648.05328: variable 'interface' from source: set_fact
41175 1727204648.05337: variable 'interface' from source: set_fact
41175 1727204648.05388: variable 'interface' from source: set_fact
41175 1727204648.05454: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
41175 1727204648.05597: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
41175 1727204648.05633: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
41175 1727204648.05659: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
41175 1727204648.05684: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
41175 1727204648.05729: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
41175 1727204648.05746: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41175 1727204648.05770: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204648.05793: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41175 1727204648.05848: variable '__network_team_connections_defined' from source: role '' defaults 41175 1727204648.06057: variable 'network_connections' from source: task vars 41175 1727204648.06062: variable 'interface' from source: set_fact 41175 1727204648.06145: variable 'interface' from source: set_fact 41175 1727204648.06149: variable 'interface' from source: set_fact 41175 1727204648.06280: variable 'interface' from source: set_fact 41175 1727204648.06283: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 41175 1727204648.06286: when evaluation is False, skipping this task 41175 1727204648.06291: _execute() done 41175 1727204648.06294: dumping result to json 41175 1727204648.06296: done dumping result, returning 41175 1727204648.06299: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12b410aa-8751-f070-39c4-00000000001b] 41175 1727204648.06301: sending task result for task 12b410aa-8751-f070-39c4-00000000001b 41175 1727204648.06373: done sending task result for task 12b410aa-8751-f070-39c4-00000000001b 41175 1727204648.06376: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": 
"Conditional result was False" } 41175 1727204648.06646: no more pending results, returning what we have 41175 1727204648.06649: results queue empty 41175 1727204648.06651: checking for any_errors_fatal 41175 1727204648.06658: done checking for any_errors_fatal 41175 1727204648.06659: checking for max_fail_percentage 41175 1727204648.06660: done checking for max_fail_percentage 41175 1727204648.06661: checking to see if all hosts have failed and the running result is not ok 41175 1727204648.06662: done checking to see if all hosts have failed 41175 1727204648.06663: getting the remaining hosts for this loop 41175 1727204648.06665: done getting the remaining hosts for this loop 41175 1727204648.06669: getting the next task for host managed-node3 41175 1727204648.06675: done getting next task for host managed-node3 41175 1727204648.06679: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 41175 1727204648.06682: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204648.06701: getting variables 41175 1727204648.06703: in VariableManager get_vars() 41175 1727204648.06745: Calling all_inventory to load vars for managed-node3 41175 1727204648.06753: Calling groups_inventory to load vars for managed-node3 41175 1727204648.06757: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204648.06768: Calling all_plugins_play to load vars for managed-node3 41175 1727204648.06771: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204648.06775: Calling groups_plugins_play to load vars for managed-node3 41175 1727204648.13367: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204648.16494: done with get_vars() 41175 1727204648.16532: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 41175 1727204648.16611: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 15:04:08 -0400 (0:00:00.154) 0:00:15.305 ***** 41175 1727204648.16644: entering _queue_task() for managed-node3/yum 41175 1727204648.16646: Creating lock for yum 41175 1727204648.16988: worker is 1 (out of 1 available) 41175 1727204648.17203: exiting _queue_task() for managed-node3/yum 41175 1727204648.17214: done queuing things up, now waiting for results queue to drain 41175 1727204648.17216: waiting for pending results... 
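The skip decisions in the trace above come from `when:` conditions on the role's tasks: the log shows `ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7` selecting the DNF path, `ansible_distribution_major_version | int < 8` gating the YUM variant, and `redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf`. A minimal Python sketch of that gating, assuming Ansible's actual evaluation happens via Jinja2 against host facts (the function and fact values here are illustrative, not part of the role):

```python
# Hedged sketch of the distribution-version gating seen in the log.
# Ansible renders these conditions with Jinja2 over gathered facts; this
# simplified stand-in only mirrors the two expressions the trace prints.
def choose_package_check(distribution: str, major_version: str) -> str:
    """Pick which package-manager check task would run (simplified)."""
    major = int(major_version)
    # "ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7"
    dnf_capable = distribution == "Fedora" or major > 7
    # "ansible_distribution_major_version | int < 8" gates the YUM variant
    if dnf_capable:
        return "dnf"
    return "yum" if major < 8 else "none"

print(choose_package_check("Fedora", "40"))  # managed-node3 is Fedora here -> "dnf"
```

Note that even when the YUM variant would apply, modern Ansible redirects `ansible.builtin.yum` to `ansible.builtin.dnf`, which is exactly the redirect recorded before the task header above.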
41175 1727204648.17510: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 41175 1727204648.17519: in run() - task 12b410aa-8751-f070-39c4-00000000001c 41175 1727204648.17521: variable 'ansible_search_path' from source: unknown 41175 1727204648.17525: variable 'ansible_search_path' from source: unknown 41175 1727204648.17558: calling self._execute() 41175 1727204648.17664: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204648.17678: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204648.17695: variable 'omit' from source: magic vars 41175 1727204648.18143: variable 'ansible_distribution_major_version' from source: facts 41175 1727204648.18163: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204648.18396: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41175 1727204648.21045: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41175 1727204648.21144: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41175 1727204648.21225: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41175 1727204648.21250: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41175 1727204648.21288: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41175 1727204648.21392: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204648.21441: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204648.21553: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204648.21556: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204648.21559: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204648.21687: variable 'ansible_distribution_major_version' from source: facts 41175 1727204648.21714: Evaluated conditional (ansible_distribution_major_version | int < 8): False 41175 1727204648.21723: when evaluation is False, skipping this task 41175 1727204648.21732: _execute() done 41175 1727204648.21740: dumping result to json 41175 1727204648.21749: done dumping result, returning 41175 1727204648.21767: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12b410aa-8751-f070-39c4-00000000001c] 41175 1727204648.21778: sending task result for task 12b410aa-8751-f070-39c4-00000000001c skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 41175 1727204648.21961: no more pending results, returning what we have 41175 1727204648.21966: results queue empty 41175 1727204648.21967: checking for any_errors_fatal 41175 1727204648.21976: done 
checking for any_errors_fatal 41175 1727204648.21977: checking for max_fail_percentage 41175 1727204648.21979: done checking for max_fail_percentage 41175 1727204648.21980: checking to see if all hosts have failed and the running result is not ok 41175 1727204648.21982: done checking to see if all hosts have failed 41175 1727204648.21983: getting the remaining hosts for this loop 41175 1727204648.21985: done getting the remaining hosts for this loop 41175 1727204648.21992: getting the next task for host managed-node3 41175 1727204648.22001: done getting next task for host managed-node3 41175 1727204648.22005: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 41175 1727204648.22009: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204648.22027: getting variables 41175 1727204648.22030: in VariableManager get_vars() 41175 1727204648.22079: Calling all_inventory to load vars for managed-node3 41175 1727204648.22082: Calling groups_inventory to load vars for managed-node3 41175 1727204648.22085: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204648.22302: Calling all_plugins_play to load vars for managed-node3 41175 1727204648.22306: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204648.22311: Calling groups_plugins_play to load vars for managed-node3 41175 1727204648.23006: done sending task result for task 12b410aa-8751-f070-39c4-00000000001c 41175 1727204648.23010: WORKER PROCESS EXITING 41175 1727204648.24633: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204648.27666: done with get_vars() 41175 1727204648.27704: done getting variables 41175 1727204648.27775: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 15:04:08 -0400 (0:00:00.111) 0:00:15.417 ***** 41175 1727204648.27816: entering _queue_task() for managed-node3/fail 41175 1727204648.28158: worker is 1 (out of 1 available) 41175 1727204648.28172: exiting _queue_task() for managed-node3/fail 41175 1727204648.28184: done queuing things up, now waiting for results queue to drain 41175 1727204648.28186: waiting for pending results... 
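Both skipped tasks so far share the gate `__network_wireless_connections_defined or __network_team_connections_defined`, evaluated against `network_connections`. The real role derives these flags with Jinja2 filters; the following is only an illustrative Python sketch using the variable names from the log (the helper and the sample connection are assumptions):

```python
# Hedged sketch of the wireless/team gate that produced the skips above.
# The role computes these booleans from `network_connections`; this helper
# is a stand-in for that Jinja2 logic, not the role's actual code.
def has_connection_type(network_connections, conn_type):
    return any(c.get("type") == conn_type for c in network_connections)

network_connections = [{"name": "eth-test", "type": "ethernet"}]  # assumed example

wireless_defined = has_connection_type(network_connections, "wireless")  # __network_wireless_connections_defined
team_defined = has_connection_type(network_connections, "team")          # __network_team_connections_defined

# Matches the trace: the conditional evaluates to False, so the task is skipped.
print(wireless_defined or team_defined)  # False for an ethernet-only run
```

With only an ethernet connection defined, the gate is False, which is why the DNF check, the YUM check, and the consent prompt below are all skipped.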
41175 1727204648.28501: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 41175 1727204648.28669: in run() - task 12b410aa-8751-f070-39c4-00000000001d 41175 1727204648.28698: variable 'ansible_search_path' from source: unknown 41175 1727204648.28709: variable 'ansible_search_path' from source: unknown 41175 1727204648.28762: calling self._execute() 41175 1727204648.28880: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204648.28899: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204648.28916: variable 'omit' from source: magic vars 41175 1727204648.29381: variable 'ansible_distribution_major_version' from source: facts 41175 1727204648.29406: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204648.29570: variable '__network_wireless_connections_defined' from source: role '' defaults 41175 1727204648.29843: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41175 1727204648.32923: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41175 1727204648.33013: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41175 1727204648.33082: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41175 1727204648.33140: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41175 1727204648.33240: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41175 1727204648.33282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 41175 1727204648.33324: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204648.33364: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204648.33418: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204648.33443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204648.33508: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204648.33541: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204648.33580: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204648.33633: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204648.33674: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204648.33714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204648.33746: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204648.33995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204648.33999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204648.34002: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204648.34060: variable 'network_connections' from source: task vars 41175 1727204648.34080: variable 'interface' from source: set_fact 41175 1727204648.34186: variable 'interface' from source: set_fact 41175 1727204648.34203: variable 'interface' from source: set_fact 41175 1727204648.34281: variable 'interface' from source: set_fact 41175 1727204648.34379: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41175 1727204648.34592: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41175 1727204648.34644: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41175 1727204648.34697: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41175 1727204648.34754: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41175 1727204648.34817: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41175 1727204648.34849: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41175 1727204648.34894: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204648.34934: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41175 1727204648.35020: variable '__network_team_connections_defined' from source: role '' defaults 41175 1727204648.35358: variable 'network_connections' from source: task vars 41175 1727204648.35372: variable 'interface' from source: set_fact 41175 1727204648.35463: variable 'interface' from source: set_fact 41175 1727204648.35477: variable 'interface' from source: set_fact 41175 1727204648.35564: variable 'interface' from source: set_fact 41175 1727204648.35622: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 41175 1727204648.35632: when evaluation is False, skipping this task 41175 1727204648.35646: _execute() done 41175 1727204648.35654: dumping result to json 41175 1727204648.35663: done dumping result, returning 41175 1727204648.35678: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's 
consent to restart NetworkManager due to wireless or team interfaces [12b410aa-8751-f070-39c4-00000000001d] 41175 1727204648.35753: sending task result for task 12b410aa-8751-f070-39c4-00000000001d 41175 1727204648.35838: done sending task result for task 12b410aa-8751-f070-39c4-00000000001d 41175 1727204648.35841: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 41175 1727204648.35922: no more pending results, returning what we have 41175 1727204648.35927: results queue empty 41175 1727204648.35928: checking for any_errors_fatal 41175 1727204648.35936: done checking for any_errors_fatal 41175 1727204648.35937: checking for max_fail_percentage 41175 1727204648.35939: done checking for max_fail_percentage 41175 1727204648.35940: checking to see if all hosts have failed and the running result is not ok 41175 1727204648.35941: done checking to see if all hosts have failed 41175 1727204648.35942: getting the remaining hosts for this loop 41175 1727204648.35944: done getting the remaining hosts for this loop 41175 1727204648.35951: getting the next task for host managed-node3 41175 1727204648.35960: done getting next task for host managed-node3 41175 1727204648.35965: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 41175 1727204648.35969: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 41175 1727204648.35986: getting variables 41175 1727204648.35988: in VariableManager get_vars() 41175 1727204648.36048: Calling all_inventory to load vars for managed-node3 41175 1727204648.36052: Calling groups_inventory to load vars for managed-node3 41175 1727204648.36055: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204648.36069: Calling all_plugins_play to load vars for managed-node3 41175 1727204648.36073: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204648.36077: Calling groups_plugins_play to load vars for managed-node3 41175 1727204648.38771: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204648.41764: done with get_vars() 41175 1727204648.41802: done getting variables 41175 1727204648.41871: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 15:04:08 -0400 (0:00:00.140) 0:00:15.558 ***** 41175 1727204648.41913: entering _queue_task() for managed-node3/package 41175 1727204648.42255: worker is 1 (out of 1 available) 41175 1727204648.42269: exiting _queue_task() for managed-node3/package 41175 1727204648.42281: done queuing things up, now waiting for results queue to drain 41175 1727204648.42283: waiting for pending results... 
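The Install packages task queued above resolves its list from layered role defaults (`network_packages`, `__network_packages_default_nm`, `__network_packages_default_wpa_supplicant`, `__network_wpa_supplicant_required`, all named in the trace). A hedged sketch of that layering, with the concrete package names assumed rather than taken from the role:

```python
# Hedged sketch of how the role might assemble its package list. The variable
# names mirror the log; the actual defaults live in the role's YAML, and the
# package names here are illustrative assumptions.
def build_network_packages(wpa_supplicant_required: bool) -> list:
    packages = ["NetworkManager"]          # stand-in for __network_packages_default_nm
    if wpa_supplicant_required:            # __network_wpa_supplicant_required
        packages.append("wpa_supplicant")  # __network_packages_default_wpa_supplicant
    return packages

# No wireless or 802.1X connections are defined in this run
# (__network_ieee802_1x_connections_defined is False), so only the base set.
print(build_network_packages(False))
```

This mirrors the variable resolution the trace walks through next: the wpa_supplicant default is only pulled in when wireless or IEEE 802.1X connections make it required.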
41175 1727204648.42711: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages 41175 1727204648.42759: in run() - task 12b410aa-8751-f070-39c4-00000000001e 41175 1727204648.42782: variable 'ansible_search_path' from source: unknown 41175 1727204648.42792: variable 'ansible_search_path' from source: unknown 41175 1727204648.42841: calling self._execute() 41175 1727204648.42948: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204648.42962: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204648.42979: variable 'omit' from source: magic vars 41175 1727204648.43432: variable 'ansible_distribution_major_version' from source: facts 41175 1727204648.43695: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204648.43711: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41175 1727204648.44030: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41175 1727204648.44087: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41175 1727204648.44137: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41175 1727204648.44215: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41175 1727204648.44370: variable 'network_packages' from source: role '' defaults 41175 1727204648.44515: variable '__network_provider_setup' from source: role '' defaults 41175 1727204648.44532: variable '__network_service_name_default_nm' from source: role '' defaults 41175 1727204648.44626: variable '__network_service_name_default_nm' from source: role '' defaults 41175 1727204648.44641: variable '__network_packages_default_nm' from source: role '' defaults 41175 1727204648.44725: variable 
'__network_packages_default_nm' from source: role '' defaults 41175 1727204648.44994: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41175 1727204648.47432: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41175 1727204648.47507: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41175 1727204648.47551: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41175 1727204648.47592: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41175 1727204648.47630: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41175 1727204648.47744: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204648.47784: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204648.47829: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204648.47896: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204648.47921: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 
1727204648.47994: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204648.48030: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204648.48071: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204648.48130: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204648.48154: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204648.48596: variable '__network_packages_default_gobject_packages' from source: role '' defaults 41175 1727204648.48619: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204648.48656: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204648.48694: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204648.48756: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204648.48779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204648.48899: variable 'ansible_python' from source: facts 41175 1727204648.48940: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 41175 1727204648.49047: variable '__network_wpa_supplicant_required' from source: role '' defaults 41175 1727204648.49155: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 41175 1727204648.49328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204648.49395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204648.49407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204648.49463: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204648.49492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204648.49556: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204648.49687: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204648.49690: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204648.49696: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204648.49720: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204648.49914: variable 'network_connections' from source: task vars 41175 1727204648.49928: variable 'interface' from source: set_fact 41175 1727204648.50058: variable 'interface' from source: set_fact 41175 1727204648.50075: variable 'interface' from source: set_fact 41175 1727204648.50206: variable 'interface' from source: set_fact 41175 1727204648.50313: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41175 1727204648.50357: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41175 1727204648.50403: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204648.50449: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41175 1727204648.50515: variable '__network_wireless_connections_defined' from source: role '' defaults 41175 1727204648.51004: variable 'network_connections' from source: task vars 41175 1727204648.51007: variable 'interface' from source: set_fact 41175 1727204648.51040: variable 'interface' from source: set_fact 41175 1727204648.51056: variable 'interface' from source: set_fact 41175 1727204648.51183: variable 'interface' from source: set_fact 41175 1727204648.51266: variable '__network_packages_default_wireless' from source: role '' defaults 41175 1727204648.51373: variable '__network_wireless_connections_defined' from source: role '' defaults 41175 1727204648.51667: variable 'network_connections' from source: task vars 41175 1727204648.51673: variable 'interface' from source: set_fact 41175 1727204648.51729: variable 'interface' from source: set_fact 41175 1727204648.51735: variable 'interface' from source: set_fact 41175 1727204648.51793: variable 'interface' from source: set_fact 41175 1727204648.51822: variable '__network_packages_default_team' from source: role '' defaults 41175 1727204648.51886: variable '__network_team_connections_defined' from source: role '' defaults 41175 1727204648.52140: variable 'network_connections' from source: task vars 41175 1727204648.52144: variable 'interface' from source: set_fact 41175 1727204648.52202: variable 'interface' from source: set_fact 41175 1727204648.52206: variable 'interface' from source: set_fact 41175 1727204648.52261: variable 'interface' from source: set_fact 41175 1727204648.52324: variable '__network_service_name_default_initscripts' from source: role '' defaults 41175 
1727204648.52373: variable '__network_service_name_default_initscripts' from source: role '' defaults 41175 1727204648.52380: variable '__network_packages_default_initscripts' from source: role '' defaults 41175 1727204648.52435: variable '__network_packages_default_initscripts' from source: role '' defaults 41175 1727204648.52620: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 41175 1727204648.53170: variable 'network_connections' from source: task vars 41175 1727204648.53173: variable 'interface' from source: set_fact 41175 1727204648.53235: variable 'interface' from source: set_fact 41175 1727204648.53239: variable 'interface' from source: set_fact 41175 1727204648.53303: variable 'interface' from source: set_fact 41175 1727204648.53342: variable 'ansible_distribution' from source: facts 41175 1727204648.53346: variable '__network_rh_distros' from source: role '' defaults 41175 1727204648.53349: variable 'ansible_distribution_major_version' from source: facts 41175 1727204648.53444: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 41175 1727204648.53646: variable 'ansible_distribution' from source: facts 41175 1727204648.53650: variable '__network_rh_distros' from source: role '' defaults 41175 1727204648.53652: variable 'ansible_distribution_major_version' from source: facts 41175 1727204648.53654: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 41175 1727204648.53901: variable 'ansible_distribution' from source: facts 41175 1727204648.53905: variable '__network_rh_distros' from source: role '' defaults 41175 1727204648.53908: variable 'ansible_distribution_major_version' from source: facts 41175 1727204648.53910: variable 'network_provider' from source: set_fact 41175 1727204648.53912: variable 'ansible_facts' from source: unknown 41175 1727204648.54808: Evaluated conditional (not network_packages is 
subset(ansible_facts.packages.keys())): False 41175 1727204648.54812: when evaluation is False, skipping this task 41175 1727204648.54815: _execute() done 41175 1727204648.54820: dumping result to json 41175 1727204648.54823: done dumping result, returning 41175 1727204648.54831: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages [12b410aa-8751-f070-39c4-00000000001e] 41175 1727204648.54836: sending task result for task 12b410aa-8751-f070-39c4-00000000001e skipping: [managed-node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 41175 1727204648.54998: no more pending results, returning what we have 41175 1727204648.55003: results queue empty 41175 1727204648.55003: checking for any_errors_fatal 41175 1727204648.55014: done checking for any_errors_fatal 41175 1727204648.55015: checking for max_fail_percentage 41175 1727204648.55019: done checking for max_fail_percentage 41175 1727204648.55020: checking to see if all hosts have failed and the running result is not ok 41175 1727204648.55021: done checking to see if all hosts have failed 41175 1727204648.55022: getting the remaining hosts for this loop 41175 1727204648.55024: done getting the remaining hosts for this loop 41175 1727204648.55029: getting the next task for host managed-node3 41175 1727204648.55037: done getting next task for host managed-node3 41175 1727204648.55042: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 41175 1727204648.55045: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41175 1727204648.55061: getting variables 41175 1727204648.55063: in VariableManager get_vars() 41175 1727204648.55125: Calling all_inventory to load vars for managed-node3 41175 1727204648.55132: done sending task result for task 12b410aa-8751-f070-39c4-00000000001e 41175 1727204648.55142: WORKER PROCESS EXITING 41175 1727204648.55138: Calling groups_inventory to load vars for managed-node3 41175 1727204648.55146: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204648.55157: Calling all_plugins_play to load vars for managed-node3 41175 1727204648.55160: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204648.55164: Calling groups_plugins_play to load vars for managed-node3 41175 1727204648.56528: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204648.58850: done with get_vars() 41175 1727204648.58874: done getting variables 41175 1727204648.58926: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 15:04:08 -0400 (0:00:00.170) 0:00:15.728 ***** 41175 1727204648.58954: 
entering _queue_task() for managed-node3/package 41175 1727204648.59203: worker is 1 (out of 1 available) 41175 1727204648.59217: exiting _queue_task() for managed-node3/package 41175 1727204648.59230: done queuing things up, now waiting for results queue to drain 41175 1727204648.59233: waiting for pending results... 41175 1727204648.59427: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 41175 1727204648.59534: in run() - task 12b410aa-8751-f070-39c4-00000000001f 41175 1727204648.59549: variable 'ansible_search_path' from source: unknown 41175 1727204648.59553: variable 'ansible_search_path' from source: unknown 41175 1727204648.59590: calling self._execute() 41175 1727204648.59671: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204648.59683: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204648.59690: variable 'omit' from source: magic vars 41175 1727204648.60194: variable 'ansible_distribution_major_version' from source: facts 41175 1727204648.60197: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204648.60280: variable 'network_state' from source: role '' defaults 41175 1727204648.60301: Evaluated conditional (network_state != {}): False 41175 1727204648.60310: when evaluation is False, skipping this task 41175 1727204648.60320: _execute() done 41175 1727204648.60329: dumping result to json 41175 1727204648.60338: done dumping result, returning 41175 1727204648.60364: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12b410aa-8751-f070-39c4-00000000001f] 41175 1727204648.60376: sending task result for task 12b410aa-8751-f070-39c4-00000000001f skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": 
"Conditional result was False" } 41175 1727204648.60565: no more pending results, returning what we have 41175 1727204648.60569: results queue empty 41175 1727204648.60571: checking for any_errors_fatal 41175 1727204648.60580: done checking for any_errors_fatal 41175 1727204648.60581: checking for max_fail_percentage 41175 1727204648.60583: done checking for max_fail_percentage 41175 1727204648.60584: checking to see if all hosts have failed and the running result is not ok 41175 1727204648.60585: done checking to see if all hosts have failed 41175 1727204648.60586: getting the remaining hosts for this loop 41175 1727204648.60587: done getting the remaining hosts for this loop 41175 1727204648.60594: getting the next task for host managed-node3 41175 1727204648.60603: done getting next task for host managed-node3 41175 1727204648.60606: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 41175 1727204648.60610: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204648.60631: getting variables 41175 1727204648.60634: in VariableManager get_vars() 41175 1727204648.60677: Calling all_inventory to load vars for managed-node3 41175 1727204648.60681: Calling groups_inventory to load vars for managed-node3 41175 1727204648.60683: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204648.60738: Calling all_plugins_play to load vars for managed-node3 41175 1727204648.60742: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204648.60748: Calling groups_plugins_play to load vars for managed-node3 41175 1727204648.61273: done sending task result for task 12b410aa-8751-f070-39c4-00000000001f 41175 1727204648.61281: WORKER PROCESS EXITING 41175 1727204648.62719: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204648.64310: done with get_vars() 41175 1727204648.64334: done getting variables 41175 1727204648.64384: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 15:04:08 -0400 (0:00:00.054) 0:00:15.783 ***** 41175 1727204648.64413: entering _queue_task() for managed-node3/package 41175 1727204648.64735: worker is 1 (out of 1 available) 41175 1727204648.64752: exiting _queue_task() for managed-node3/package 41175 1727204648.64767: done queuing things up, now waiting for results queue to drain 41175 1727204648.64769: waiting for pending results... 
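The "Install packages" task above was skipped because its `when:` condition, `not network_packages is subset(ansible_facts.packages.keys())`, evaluated to False: every required package was already present in the gathered package facts. Ansible's `subset` Jinja2 test (loaded above from `plugins/test/mathstuff.py`) is essentially a set-containment check. A minimal sketch of that logic, with hypothetical package lists that are not taken from this log:

```python
def should_install(network_packages, installed_packages):
    """Mirror of `not network_packages is subset(ansible_facts.packages.keys())`.

    Returns True only when at least one required package is missing from the
    installed set, i.e. when the package task should actually run.
    """
    return not set(network_packages).issubset(installed_packages)

# All required packages already installed -> condition False -> task skipped,
# which matches the "skip_reason": "Conditional result was False" result above.
print(should_install(["NetworkManager"], {"NetworkManager": [], "kernel": []}))
print(should_install(["nmstate"], {"NetworkManager": []}))
```

This is why the log reports `false_condition: "not network_packages is subset(ansible_facts.packages.keys())"` rather than running the `package` action.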
41175 1727204648.65211: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 41175 1727204648.65254: in run() - task 12b410aa-8751-f070-39c4-000000000020 41175 1727204648.65279: variable 'ansible_search_path' from source: unknown 41175 1727204648.65292: variable 'ansible_search_path' from source: unknown 41175 1727204648.65354: calling self._execute() 41175 1727204648.65479: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204648.65500: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204648.65518: variable 'omit' from source: magic vars 41175 1727204648.65861: variable 'ansible_distribution_major_version' from source: facts 41175 1727204648.65873: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204648.65983: variable 'network_state' from source: role '' defaults 41175 1727204648.65993: Evaluated conditional (network_state != {}): False 41175 1727204648.65997: when evaluation is False, skipping this task 41175 1727204648.66001: _execute() done 41175 1727204648.66005: dumping result to json 41175 1727204648.66009: done dumping result, returning 41175 1727204648.66018: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12b410aa-8751-f070-39c4-000000000020] 41175 1727204648.66028: sending task result for task 12b410aa-8751-f070-39c4-000000000020 41175 1727204648.66129: done sending task result for task 12b410aa-8751-f070-39c4-000000000020 41175 1727204648.66132: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 41175 1727204648.66185: no more pending results, returning what we have 41175 1727204648.66191: results queue empty 41175 1727204648.66193: checking for 
any_errors_fatal 41175 1727204648.66202: done checking for any_errors_fatal 41175 1727204648.66203: checking for max_fail_percentage 41175 1727204648.66205: done checking for max_fail_percentage 41175 1727204648.66205: checking to see if all hosts have failed and the running result is not ok 41175 1727204648.66206: done checking to see if all hosts have failed 41175 1727204648.66207: getting the remaining hosts for this loop 41175 1727204648.66209: done getting the remaining hosts for this loop 41175 1727204648.66213: getting the next task for host managed-node3 41175 1727204648.66220: done getting next task for host managed-node3 41175 1727204648.66224: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 41175 1727204648.66227: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204648.66242: getting variables 41175 1727204648.66244: in VariableManager get_vars() 41175 1727204648.66282: Calling all_inventory to load vars for managed-node3 41175 1727204648.66285: Calling groups_inventory to load vars for managed-node3 41175 1727204648.66288: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204648.66306: Calling all_plugins_play to load vars for managed-node3 41175 1727204648.66309: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204648.66313: Calling groups_plugins_play to load vars for managed-node3 41175 1727204648.67643: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204648.70007: done with get_vars() 41175 1727204648.70046: done getting variables 41175 1727204648.70167: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 15:04:08 -0400 (0:00:00.057) 0:00:15.840 ***** 41175 1727204648.70209: entering _queue_task() for managed-node3/service 41175 1727204648.70212: Creating lock for service 41175 1727204648.70574: worker is 1 (out of 1 available) 41175 1727204648.70592: exiting _queue_task() for managed-node3/service 41175 1727204648.70606: done queuing things up, now waiting for results queue to drain 41175 1727204648.70608: waiting for pending results... 
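Both nmstate-related install tasks ("Install NetworkManager and nmstate when using network_state variable" and "Install python3-libnmstate when using network_state variable") are gated on `network_state != {}`. The log shows `network_state` resolving from the role's defaults, where it is an empty dict, so both tasks are skipped. A sketch of that gate, assuming a caller-supplied state dict:

```python
def nmstate_tasks_needed(network_state):
    """Mirror of the `network_state != {}` conditional: the nmstate install
    tasks run only when the role's `network_state` variable is non-empty.
    The default (role defaults, as in this log) is {}, so they are skipped.
    """
    return network_state != {}

print(nmstate_tasks_needed({}))                   # role default -> skipped
print(nmstate_tasks_needed({"interfaces": []}))   # user-supplied state -> runs
```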
41175 1727204648.71015: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 41175 1727204648.71107: in run() - task 12b410aa-8751-f070-39c4-000000000021 41175 1727204648.71139: variable 'ansible_search_path' from source: unknown 41175 1727204648.71153: variable 'ansible_search_path' from source: unknown 41175 1727204648.71297: calling self._execute() 41175 1727204648.71338: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204648.71355: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204648.71374: variable 'omit' from source: magic vars 41175 1727204648.71876: variable 'ansible_distribution_major_version' from source: facts 41175 1727204648.71901: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204648.72071: variable '__network_wireless_connections_defined' from source: role '' defaults 41175 1727204648.72360: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41175 1727204648.74384: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41175 1727204648.74448: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41175 1727204648.74480: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41175 1727204648.74519: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41175 1727204648.74541: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41175 1727204648.74611: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 41175 1727204648.74638: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204648.74659: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204648.74692: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204648.74709: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204648.74750: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204648.74771: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204648.74792: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204648.74828: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204648.74841: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204648.74876: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204648.74899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204648.74923: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204648.74956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204648.74968: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204648.75107: variable 'network_connections' from source: task vars 41175 1727204648.75121: variable 'interface' from source: set_fact 41175 1727204648.75181: variable 'interface' from source: set_fact 41175 1727204648.75191: variable 'interface' from source: set_fact 41175 1727204648.75242: variable 'interface' from source: set_fact 41175 1727204648.75309: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41175 1727204648.75450: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41175 1727204648.75484: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41175 1727204648.75514: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41175 1727204648.75540: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41175 1727204648.75577: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41175 1727204648.75600: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41175 1727204648.75623: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204648.75644: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41175 1727204648.75699: variable '__network_team_connections_defined' from source: role '' defaults 41175 1727204648.75897: variable 'network_connections' from source: task vars 41175 1727204648.75902: variable 'interface' from source: set_fact 41175 1727204648.75958: variable 'interface' from source: set_fact 41175 1727204648.75965: variable 'interface' from source: set_fact 41175 1727204648.76020: variable 'interface' from source: set_fact 41175 1727204648.76053: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 41175 1727204648.76057: when evaluation is False, skipping this task 41175 1727204648.76060: _execute() done 41175 1727204648.76063: dumping result to json 41175 1727204648.76068: done dumping result, returning 41175 1727204648.76076: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart 
NetworkManager due to wireless or team interfaces [12b410aa-8751-f070-39c4-000000000021] 41175 1727204648.76086: sending task result for task 12b410aa-8751-f070-39c4-000000000021 41175 1727204648.76181: done sending task result for task 12b410aa-8751-f070-39c4-000000000021 41175 1727204648.76184: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 41175 1727204648.76240: no more pending results, returning what we have 41175 1727204648.76243: results queue empty 41175 1727204648.76245: checking for any_errors_fatal 41175 1727204648.76250: done checking for any_errors_fatal 41175 1727204648.76251: checking for max_fail_percentage 41175 1727204648.76253: done checking for max_fail_percentage 41175 1727204648.76254: checking to see if all hosts have failed and the running result is not ok 41175 1727204648.76255: done checking to see if all hosts have failed 41175 1727204648.76256: getting the remaining hosts for this loop 41175 1727204648.76257: done getting the remaining hosts for this loop 41175 1727204648.76263: getting the next task for host managed-node3 41175 1727204648.76270: done getting next task for host managed-node3 41175 1727204648.76275: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 41175 1727204648.76278: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 41175 1727204648.76299: getting variables 41175 1727204648.76305: in VariableManager get_vars() 41175 1727204648.76351: Calling all_inventory to load vars for managed-node3 41175 1727204648.76354: Calling groups_inventory to load vars for managed-node3 41175 1727204648.76356: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204648.76366: Calling all_plugins_play to load vars for managed-node3 41175 1727204648.76369: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204648.76373: Calling groups_plugins_play to load vars for managed-node3 41175 1727204648.77746: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204648.79340: done with get_vars() 41175 1727204648.79363: done getting variables 41175 1727204648.79420: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 15:04:08 -0400 (0:00:00.092) 0:00:15.933 ***** 41175 1727204648.79445: entering _queue_task() for managed-node3/service 41175 1727204648.79704: worker is 1 (out of 1 available) 41175 1727204648.79722: exiting _queue_task() for managed-node3/service 41175 1727204648.79734: done queuing things up, now waiting for results queue to drain 41175 1727204648.79736: waiting for pending results... 
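The two role tasks traced above reduce to a simple control flow: the "Restart NetworkManager due to wireless or team interfaces" task is skipped because its conditional (`__network_wireless_connections_defined or __network_team_connections_defined`) evaluated to False, and the strategy then queues "Enable and start NetworkManager". A minimal runnable sketch of that flow, with the `systemctl` invocations only echoed (the actual service management is done by the `service` action plugin on the managed host, and the unit name `NetworkManager` is the role's default assumed here):

```shell
# Sketch of the task flow in the log above; commands are echoed, not run.
wireless_defined=false   # __network_wireless_connections_defined in this run
team_defined=false       # __network_team_connections_defined in this run

if $wireless_defined || $team_defined; then
    echo "systemctl restart NetworkManager"          # would run if True
else
    echo "skipping: Conditional result was False"    # what the log reports
fi

echo "systemctl enable --now NetworkManager"         # next task: main.yml:122
```

This mirrors the JSON result the worker sent back: `"skip_reason": "Conditional result was False"` with `"changed": false`, after which the host state advances to the service task.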
41175 1727204648.79941: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 41175 1727204648.80055: in run() - task 12b410aa-8751-f070-39c4-000000000022 41175 1727204648.80070: variable 'ansible_search_path' from source: unknown 41175 1727204648.80073: variable 'ansible_search_path' from source: unknown 41175 1727204648.80111: calling self._execute() 41175 1727204648.80190: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204648.80202: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204648.80209: variable 'omit' from source: magic vars 41175 1727204648.80537: variable 'ansible_distribution_major_version' from source: facts 41175 1727204648.80549: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204648.80692: variable 'network_provider' from source: set_fact 41175 1727204648.80696: variable 'network_state' from source: role '' defaults 41175 1727204648.80707: Evaluated conditional (network_provider == "nm" or network_state != {}): True 41175 1727204648.80714: variable 'omit' from source: magic vars 41175 1727204648.80766: variable 'omit' from source: magic vars 41175 1727204648.80793: variable 'network_service_name' from source: role '' defaults 41175 1727204648.80858: variable 'network_service_name' from source: role '' defaults 41175 1727204648.80951: variable '__network_provider_setup' from source: role '' defaults 41175 1727204648.80957: variable '__network_service_name_default_nm' from source: role '' defaults 41175 1727204648.81011: variable '__network_service_name_default_nm' from source: role '' defaults 41175 1727204648.81019: variable '__network_packages_default_nm' from source: role '' defaults 41175 1727204648.81072: variable '__network_packages_default_nm' from source: role '' defaults 41175 1727204648.81274: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 41175 1727204648.82963: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41175 1727204648.83025: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41175 1727204648.83077: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41175 1727204648.83111: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41175 1727204648.83136: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41175 1727204648.83210: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204648.83238: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204648.83262: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204648.83302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204648.83315: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204648.83363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 41175 1727204648.83385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204648.83407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204648.83440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204648.83454: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204648.83647: variable '__network_packages_default_gobject_packages' from source: role '' defaults 41175 1727204648.83744: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204648.83764: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204648.83786: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204648.83823: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204648.83835: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204648.83910: variable 'ansible_python' from source: facts 41175 1727204648.83931: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 41175 1727204648.83998: variable '__network_wpa_supplicant_required' from source: role '' defaults 41175 1727204648.84064: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 41175 1727204648.84173: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204648.84193: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204648.84235: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204648.84251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204648.84264: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204648.84306: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204648.84332: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204648.84357: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204648.84388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204648.84401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204648.84519: variable 'network_connections' from source: task vars 41175 1727204648.84523: variable 'interface' from source: set_fact 41175 1727204648.84694: variable 'interface' from source: set_fact 41175 1727204648.84698: variable 'interface' from source: set_fact 41175 1727204648.84724: variable 'interface' from source: set_fact 41175 1727204648.84919: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41175 1727204648.85149: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41175 1727204648.85228: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41175 1727204648.85282: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41175 1727204648.85336: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41175 1727204648.85417: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41175 1727204648.85458: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41175 1727204648.85504: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204648.85569: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41175 1727204648.85639: variable '__network_wireless_connections_defined' from source: role '' defaults 41175 1727204648.85948: variable 'network_connections' from source: task vars 41175 1727204648.85958: variable 'interface' from source: set_fact 41175 1727204648.86096: variable 'interface' from source: set_fact 41175 1727204648.86100: variable 'interface' from source: set_fact 41175 1727204648.86103: variable 'interface' from source: set_fact 41175 1727204648.86161: variable '__network_packages_default_wireless' from source: role '' defaults 41175 1727204648.86234: variable '__network_wireless_connections_defined' from source: role '' defaults 41175 1727204648.86480: variable 'network_connections' from source: task vars 41175 1727204648.86484: variable 'interface' from source: set_fact 41175 1727204648.86549: variable 'interface' from source: set_fact 41175 1727204648.86556: variable 'interface' from source: set_fact 41175 1727204648.86614: variable 'interface' from source: set_fact 41175 1727204648.86648: variable '__network_packages_default_team' from source: role '' defaults 41175 1727204648.86711: variable '__network_team_connections_defined' from source: role '' defaults 41175 1727204648.86968: variable 
'network_connections' from source: task vars 41175 1727204648.86972: variable 'interface' from source: set_fact 41175 1727204648.87034: variable 'interface' from source: set_fact 41175 1727204648.87041: variable 'interface' from source: set_fact 41175 1727204648.87101: variable 'interface' from source: set_fact 41175 1727204648.87157: variable '__network_service_name_default_initscripts' from source: role '' defaults 41175 1727204648.87211: variable '__network_service_name_default_initscripts' from source: role '' defaults 41175 1727204648.87220: variable '__network_packages_default_initscripts' from source: role '' defaults 41175 1727204648.87268: variable '__network_packages_default_initscripts' from source: role '' defaults 41175 1727204648.87453: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 41175 1727204648.87994: variable 'network_connections' from source: task vars 41175 1727204648.87998: variable 'interface' from source: set_fact 41175 1727204648.88010: variable 'interface' from source: set_fact 41175 1727204648.88023: variable 'interface' from source: set_fact 41175 1727204648.88098: variable 'interface' from source: set_fact 41175 1727204648.88123: variable 'ansible_distribution' from source: facts 41175 1727204648.88133: variable '__network_rh_distros' from source: role '' defaults 41175 1727204648.88145: variable 'ansible_distribution_major_version' from source: facts 41175 1727204648.88175: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 41175 1727204648.88402: variable 'ansible_distribution' from source: facts 41175 1727204648.88411: variable '__network_rh_distros' from source: role '' defaults 41175 1727204648.88422: variable 'ansible_distribution_major_version' from source: facts 41175 1727204648.88435: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 41175 1727204648.88646: variable 'ansible_distribution' from source: 
facts 41175 1727204648.88656: variable '__network_rh_distros' from source: role '' defaults 41175 1727204648.88665: variable 'ansible_distribution_major_version' from source: facts 41175 1727204648.88716: variable 'network_provider' from source: set_fact 41175 1727204648.88742: variable 'omit' from source: magic vars 41175 1727204648.88781: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41175 1727204648.88817: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41175 1727204648.88844: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41175 1727204648.88870: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204648.88887: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204648.88994: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41175 1727204648.88998: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204648.89000: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204648.89064: Set connection var ansible_shell_executable to /bin/sh 41175 1727204648.89073: Set connection var ansible_shell_type to sh 41175 1727204648.89084: Set connection var ansible_pipelining to False 41175 1727204648.89101: Set connection var ansible_timeout to 10 41175 1727204648.89111: Set connection var ansible_connection to ssh 41175 1727204648.89122: Set connection var ansible_module_compression to ZIP_DEFLATED 41175 1727204648.89156: variable 'ansible_shell_executable' from source: unknown 41175 1727204648.89164: variable 'ansible_connection' from source: unknown 41175 1727204648.89173: variable 'ansible_module_compression' from source: unknown 41175 1727204648.89180: 
variable 'ansible_shell_type' from source: unknown 41175 1727204648.89188: variable 'ansible_shell_executable' from source: unknown 41175 1727204648.89197: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204648.89212: variable 'ansible_pipelining' from source: unknown 41175 1727204648.89394: variable 'ansible_timeout' from source: unknown 41175 1727204648.89397: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204648.89401: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41175 1727204648.89403: variable 'omit' from source: magic vars 41175 1727204648.89406: starting attempt loop 41175 1727204648.89408: running the handler 41175 1727204648.89483: variable 'ansible_facts' from source: unknown 41175 1727204648.90664: _low_level_execute_command(): starting 41175 1727204648.90677: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41175 1727204648.91388: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 1727204648.91406: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204648.91422: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204648.91441: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204648.91459: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 41175 1727204648.91472: stderr chunk (state=3): >>>debug2: match not found <<< 41175 1727204648.91485: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204648.91587: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204648.91614: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204648.91699: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204648.93445: stdout chunk (state=3): >>>/root <<< 41175 1727204648.93622: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204648.93644: stdout chunk (state=3): >>><<< 41175 1727204648.93655: stderr chunk (state=3): >>><<< 41175 1727204648.93678: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204648.93697: _low_level_execute_command(): starting 41175 1727204648.93707: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204648.936838-41951-9365508139666 `" && echo ansible-tmp-1727204648.936838-41951-9365508139666="` echo /root/.ansible/tmp/ansible-tmp-1727204648.936838-41951-9365508139666 `" ) && sleep 0' 41175 1727204648.94368: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 1727204648.94381: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204648.94403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204648.94424: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204648.94446: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 41175 1727204648.94570: stderr chunk (state=3): >>>debug2: match not found <<< 41175 1727204648.94575: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204648.94600: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204648.94684: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204648.96700: stdout chunk (state=3): >>>ansible-tmp-1727204648.936838-41951-9365508139666=/root/.ansible/tmp/ansible-tmp-1727204648.936838-41951-9365508139666 <<< 41175 1727204648.96884: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204648.96904: stderr chunk (state=3): >>><<< 41175 1727204648.96914: stdout chunk (state=3): >>><<< 41175 1727204648.96940: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204648.936838-41951-9365508139666=/root/.ansible/tmp/ansible-tmp-1727204648.936838-41951-9365508139666 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 41175 1727204648.96995: variable 'ansible_module_compression' from source: unknown 41175 1727204648.97057: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 41175 1727204648.97094: ANSIBALLZ: Acquiring lock 41175 1727204648.97097: ANSIBALLZ: Lock acquired: 140088839296144 41175 1727204648.97104: ANSIBALLZ: Creating module 41175 1727204649.35631: ANSIBALLZ: Writing module into payload 41175 1727204649.35768: ANSIBALLZ: Writing module 41175 1727204649.35799: ANSIBALLZ: Renaming module 41175 1727204649.35806: ANSIBALLZ: Done creating module 41175 1727204649.35827: variable 'ansible_facts' from source: unknown 41175 1727204649.35939: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204648.936838-41951-9365508139666/AnsiballZ_systemd.py 41175 1727204649.36064: Sending initial data 41175 1727204649.36067: Sent initial data (153 bytes) 41175 1727204649.36533: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204649.36539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204649.36541: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204649.36544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204649.36633: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204649.36636: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204649.36677: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204649.38420: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41175 1727204649.38460: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41175 1727204649.38487: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41175qa2aqh90/tmppoijonkw /root/.ansible/tmp/ansible-tmp-1727204648.936838-41951-9365508139666/AnsiballZ_systemd.py <<< 41175 1727204649.38502: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204648.936838-41951-9365508139666/AnsiballZ_systemd.py" <<< 41175 1727204649.38524: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-41175qa2aqh90/tmppoijonkw" to remote "/root/.ansible/tmp/ansible-tmp-1727204648.936838-41951-9365508139666/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204648.936838-41951-9365508139666/AnsiballZ_systemd.py" <<< 41175 1727204649.40396: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204649.40435: stderr chunk (state=3): >>><<< 41175 1727204649.40445: stdout chunk (state=3): >>><<< 41175 1727204649.40479: done transferring module to remote 41175 1727204649.40500: _low_level_execute_command(): starting 41175 1727204649.40512: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204648.936838-41951-9365508139666/ /root/.ansible/tmp/ansible-tmp-1727204648.936838-41951-9365508139666/AnsiballZ_systemd.py && sleep 0' 41175 1727204649.41225: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 1727204649.41228: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204649.41237: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204649.41256: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204649.41260: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204649.41284: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 41175 1727204649.41288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204649.41340: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204649.41347: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204649.41382: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204649.43289: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204649.43337: stderr chunk (state=3): >>><<< 41175 1727204649.43340: stdout chunk (state=3): >>><<< 41175 1727204649.43355: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204649.43358: _low_level_execute_command(): starting 41175 1727204649.43364: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204648.936838-41951-9365508139666/AnsiballZ_systemd.py && sleep 0' 41175 1727204649.43791: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204649.43822: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 41175 1727204649.43831: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204649.43834: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204649.43836: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204649.43886: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing 
master <<< 41175 1727204649.43893: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204649.43936: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204649.77144: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "647", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:45:24 EDT", "ExecMainStartTimestampMonotonic": "28911103", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "647", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3378", "MemoryCurrent": "11894784", "MemoryAvailable": "infinity", "CPUUsageNSec": "1885404000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", 
"ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", 
"ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": 
"NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target cloud-init.service shutdown.target NetworkManager-wait-online.service network.service network.target", "After": "network-pre.target basic.target dbus-broker.service cloud-init-local.service systemd-journald.socket sysinit.target dbus.socket system.slice", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 15:02:42 EDT", "StateChangeTimestampMonotonic": "1066831351", "InactiveExitTimestamp": "Tue 2024-09-24 14:45:24 EDT", "InactiveExitTimestampMonotonic": "28911342", "ActiveEnterTimestamp": "Tue 2024-09-24 14:45:25 EDT", "ActiveEnterTimestampMonotonic": "29816317", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:45:24 EDT", "ConditionTimestampMonotonic": "28901880", "AssertTimestamp": "Tue 2024-09-24 14:45:24 EDT", "AssertTimestampMonotonic": "28901883", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", 
"StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "b6a383b318af414f897f5e2227729b18", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 41175 1727204649.79197: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204649.79201: stderr chunk (state=3): >>>Shared connection to 10.31.10.90 closed. <<< 41175 1727204649.79203: stdout chunk (state=3): >>><<< 41175 1727204649.79206: stderr chunk (state=3): >>><<< 41175 1727204649.79209: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "647", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:45:24 EDT", "ExecMainStartTimestampMonotonic": "28911103", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "647", "ExecMainCode": "0", 
"ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3378", "MemoryCurrent": "11894784", "MemoryAvailable": "infinity", "CPUUsageNSec": "1885404000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", 
"MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", 
"SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target cloud-init.service shutdown.target NetworkManager-wait-online.service network.service network.target", "After": "network-pre.target basic.target dbus-broker.service cloud-init-local.service systemd-journald.socket sysinit.target dbus.socket system.slice", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 15:02:42 EDT", "StateChangeTimestampMonotonic": "1066831351", "InactiveExitTimestamp": "Tue 2024-09-24 14:45:24 EDT", "InactiveExitTimestampMonotonic": "28911342", "ActiveEnterTimestamp": "Tue 2024-09-24 14:45:25 EDT", "ActiveEnterTimestampMonotonic": "29816317", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": 
"no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:45:24 EDT", "ConditionTimestampMonotonic": "28901880", "AssertTimestamp": "Tue 2024-09-24 14:45:24 EDT", "AssertTimestampMonotonic": "28901883", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "b6a383b318af414f897f5e2227729b18", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 41175 1727204649.79516: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204648.936838-41951-9365508139666/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41175 1727204649.79525: _low_level_execute_command(): starting 41175 1727204649.79532: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204648.936838-41951-9365508139666/ > /dev/null 2>&1 && sleep 0' 41175 1727204649.80295: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204649.80301: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204649.80322: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 41175 1727204649.80326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204649.80345: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 41175 1727204649.80352: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204649.80392: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 41175 1727204649.80396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204649.80477: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204649.80498: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204649.80520: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204649.80597: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204649.82596: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204649.82600: stderr chunk (state=3): >>><<< 41175 1727204649.82608: stdout chunk (state=3): >>><<< 41175 1727204649.82630: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204649.82640: handler run complete 41175 1727204649.82730: attempt loop complete, returning result 41175 1727204649.82733: _execute() done 41175 1727204649.82736: dumping result to json 41175 1727204649.82760: done dumping result, returning 41175 1727204649.82773: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12b410aa-8751-f070-39c4-000000000022] 41175 1727204649.82778: sending task result for task 12b410aa-8751-f070-39c4-000000000022 ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 41175 1727204649.83250: no more pending results, returning what we have 41175 1727204649.83254: results queue empty 41175 1727204649.83256: checking for any_errors_fatal 41175 1727204649.83264: done checking for any_errors_fatal 41175 1727204649.83265: checking for max_fail_percentage 41175 1727204649.83267: done checking for max_fail_percentage 41175 1727204649.83268: checking to see if all hosts have failed and the running result is not ok 41175 1727204649.83269: done checking to see if all hosts have failed 41175 1727204649.83270: getting the remaining hosts for this loop 41175 1727204649.83272: done getting the remaining hosts for this loop 41175 1727204649.83277: getting the next task for host managed-node3 41175 1727204649.83285: done getting next task for host managed-node3 41175 1727204649.83295: ^ task is: TASK: fedora.linux_system_roles.network : Enable and 
start wpa_supplicant
41175 1727204649.83298: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
41175 1727204649.83311: getting variables
41175 1727204649.83314: in VariableManager get_vars()
41175 1727204649.83358: Calling all_inventory to load vars for managed-node3
41175 1727204649.83361: Calling groups_inventory to load vars for managed-node3
41175 1727204649.83364: Calling all_plugins_inventory to load vars for managed-node3
41175 1727204649.83377: Calling all_plugins_play to load vars for managed-node3
41175 1727204649.83381: Calling groups_plugins_inventory to load vars for managed-node3
41175 1727204649.83385: Calling groups_plugins_play to load vars for managed-node3
41175 1727204649.84133: done sending task result for task 12b410aa-8751-f070-39c4-000000000022
41175 1727204649.84137: WORKER PROCESS EXITING
41175 1727204649.86042: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41175 1727204649.89358: done with get_vars()
41175 1727204649.89401: done getting variables
41175 1727204649.89473: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] *****
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133
Tuesday 24 September 2024  15:04:09 -0400 (0:00:01.100)       0:00:17.034 *****
41175 1727204649.89522: entering _queue_task() for managed-node3/service
41175 1727204649.89899: worker is 1 (out of 1 available)
41175 1727204649.90027: exiting _queue_task() for managed-node3/service
41175 1727204649.90039: done queuing things up, now waiting for results queue to drain
41175 1727204649.90041: waiting for pending results...
41175 1727204649.90266: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant
41175 1727204649.90446: in run() - task 12b410aa-8751-f070-39c4-000000000023
41175 1727204649.90473: variable 'ansible_search_path' from source: unknown
41175 1727204649.90482: variable 'ansible_search_path' from source: unknown
41175 1727204649.90534: calling self._execute()
41175 1727204649.90649: variable 'ansible_host' from source: host vars for 'managed-node3'
41175 1727204649.90663: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
41175 1727204649.90684: variable 'omit' from source: magic vars
41175 1727204649.91165: variable 'ansible_distribution_major_version' from source: facts
41175 1727204649.91184: Evaluated conditional (ansible_distribution_major_version != '6'): True
41175 1727204649.91358: variable 'network_provider' from source: set_fact
41175 1727204649.91371: Evaluated conditional (network_provider == "nm"): True
41175 1727204649.91503: variable '__network_wpa_supplicant_required' from source: role '' defaults
41175 1727204649.91631: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults
41175 1727204649.91881: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
41175 1727204649.94697: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
41175 1727204649.94730: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
41175 1727204649.94783: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
41175 1727204649.94846: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
41175 1727204649.94883: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
41175 1727204649.95005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
41175 1727204649.95059: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
41175 1727204649.95096: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
41175 1727204649.95168: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
41175 1727204649.95196: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
41175 1727204649.95296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
41175 1727204649.95312: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
41175 1727204649.95351: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
41175 1727204649.95421: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
41175 1727204649.95478: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
41175 1727204649.95506: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
41175 1727204649.95527: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
41175 1727204649.95551: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
41175 1727204649.95583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
41175 1727204649.95598: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
41175 1727204649.95730: variable 'network_connections' from source: task vars
41175 1727204649.95742: variable 'interface' from source: set_fact
41175 1727204649.95807: variable 'interface' from source: set_fact
41175 1727204649.95820: variable 'interface' from source: set_fact
41175 1727204649.95871: variable 'interface' from source: set_fact
41175 1727204649.95942: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
41175 1727204649.96078: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
41175 1727204649.96113: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
41175 1727204649.96145: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
41175 1727204649.96168: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
41175 1727204649.96211: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
41175 1727204649.96230: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
41175 1727204649.96253: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
41175 1727204649.96275: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
41175 1727204649.96322: variable '__network_wireless_connections_defined' from source: role '' defaults
41175 1727204649.96537: variable 'network_connections' from source: task vars
41175 1727204649.96543: variable 'interface' from source: set_fact
41175 1727204649.96597: variable 'interface' from source: set_fact
41175 1727204649.96604: variable 'interface' from source: set_fact
41175 1727204649.96658: variable 'interface' from source: set_fact
41175 1727204649.96701: Evaluated conditional (__network_wpa_supplicant_required): False
41175 1727204649.96705: when evaluation is False, skipping this task
41175 1727204649.96707: _execute() done
41175 1727204649.96721: dumping result to json
41175 1727204649.96724: done dumping result, returning
41175 1727204649.96727: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12b410aa-8751-f070-39c4-000000000023]
41175 1727204649.96730: sending task result for task 12b410aa-8751-f070-39c4-000000000023
41175 1727204649.96832: done sending task result for task 12b410aa-8751-f070-39c4-000000000023
41175 1727204649.96835: WORKER PROCESS EXITING
skipping: [managed-node3] => {
    "changed": false,
    "false_condition": "__network_wpa_supplicant_required",
    "skip_reason": "Conditional result was False"
}
41175 1727204649.96886: no more pending results, returning what we have
41175 1727204649.96892: results queue empty
41175 1727204649.96893: checking for any_errors_fatal
41175 1727204649.96921: done checking for any_errors_fatal
41175 1727204649.96922: checking for max_fail_percentage
41175 1727204649.96924: done checking for max_fail_percentage
41175 1727204649.96925: checking to see if all hosts have failed and the running result is not ok
41175 1727204649.96926: done checking to see if all hosts have failed
41175 1727204649.96927: getting the remaining hosts for this loop
41175 1727204649.96929: done getting the remaining hosts for this loop
41175 1727204649.96933: getting the next task for host managed-node3
41175 1727204649.96942: done getting next task for host managed-node3
41175 1727204649.96946: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service
41175 1727204649.96949: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
41175 1727204649.96964: getting variables
41175 1727204649.96966: in VariableManager get_vars()
41175 1727204649.97019: Calling all_inventory to load vars for managed-node3
41175 1727204649.97023: Calling groups_inventory to load vars for managed-node3
41175 1727204649.97025: Calling all_plugins_inventory to load vars for managed-node3
41175 1727204649.97036: Calling all_plugins_play to load vars for managed-node3
41175 1727204649.97039: Calling groups_plugins_inventory to load vars for managed-node3
41175 1727204649.97042: Calling groups_plugins_play to load vars for managed-node3
41175 1727204649.98863: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41175 1727204650.00464: done with get_vars()
41175 1727204650.00494: done getting variables
41175 1727204650.00552: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable network service] **************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142
Tuesday 24 September 2024  15:04:10 -0400 (0:00:00.110)       0:00:17.144 *****
41175 1727204650.00597: entering _queue_task() for managed-node3/service
41175 1727204650.00948: worker is 1 (out of 1 available)
41175 1727204650.00963: exiting _queue_task() for managed-node3/service
41175 1727204650.00978: done queuing things up, now waiting for results queue to drain
41175 1727204650.00980: waiting for pending results...
41175 1727204650.01413: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service
41175 1727204650.01475: in run() - task 12b410aa-8751-f070-39c4-000000000024
41175 1727204650.01503: variable 'ansible_search_path' from source: unknown
41175 1727204650.01516: variable 'ansible_search_path' from source: unknown
41175 1727204650.01596: calling self._execute()
41175 1727204650.01671: variable 'ansible_host' from source: host vars for 'managed-node3'
41175 1727204650.01686: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
41175 1727204650.01704: variable 'omit' from source: magic vars
41175 1727204650.02174: variable 'ansible_distribution_major_version' from source: facts
41175 1727204650.02186: Evaluated conditional (ansible_distribution_major_version != '6'): True
41175 1727204650.02292: variable 'network_provider' from source: set_fact
41175 1727204650.02299: Evaluated conditional (network_provider == "initscripts"): False
41175 1727204650.02302: when evaluation is False, skipping this task
41175 1727204650.02307: _execute() done
41175 1727204650.02310: dumping result to json
41175 1727204650.02315: done dumping result, returning
41175 1727204650.02338: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service [12b410aa-8751-f070-39c4-000000000024]
41175 1727204650.02347: sending task result for task 12b410aa-8751-f070-39c4-000000000024
41175 1727204650.02443: done sending task result for task 12b410aa-8751-f070-39c4-000000000024
41175 1727204650.02447: WORKER PROCESS EXITING
skipping: [managed-node3] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
41175 1727204650.02499: no more pending results, returning what we have
41175 1727204650.02504: results queue empty
41175 1727204650.02505: checking for any_errors_fatal
41175 1727204650.02516: done checking for any_errors_fatal
41175 1727204650.02516: checking for max_fail_percentage
41175 1727204650.02518: done checking for max_fail_percentage
41175 1727204650.02519: checking to see if all hosts have failed and the running result is not ok
41175 1727204650.02520: done checking to see if all hosts have failed
41175 1727204650.02521: getting the remaining hosts for this loop
41175 1727204650.02523: done getting the remaining hosts for this loop
41175 1727204650.02528: getting the next task for host managed-node3
41175 1727204650.02535: done getting next task for host managed-node3
41175 1727204650.02539: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present
41175 1727204650.02542: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
41175 1727204650.02558: getting variables
41175 1727204650.02560: in VariableManager get_vars()
41175 1727204650.02602: Calling all_inventory to load vars for managed-node3
41175 1727204650.02605: Calling groups_inventory to load vars for managed-node3
41175 1727204650.02607: Calling all_plugins_inventory to load vars for managed-node3
41175 1727204650.02618: Calling all_plugins_play to load vars for managed-node3
41175 1727204650.02621: Calling groups_plugins_inventory to load vars for managed-node3
41175 1727204650.02625: Calling groups_plugins_play to load vars for managed-node3
41175 1727204650.04040: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41175 1727204650.06634: done with get_vars()
41175 1727204650.06663: done getting variables
41175 1727204650.06718: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] ***
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150
Tuesday 24 September 2024  15:04:10 -0400 (0:00:00.061)       0:00:17.206 *****
41175 1727204650.06747: entering _queue_task() for managed-node3/copy
41175 1727204650.07011: worker is 1 (out of 1 available)
41175 1727204650.07024: exiting _queue_task() for managed-node3/copy
41175 1727204650.07037: done queuing things up, now waiting for results queue to drain
41175 1727204650.07039: waiting for pending results...
41175 1727204650.07242: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present
41175 1727204650.07352: in run() - task 12b410aa-8751-f070-39c4-000000000025
41175 1727204650.07366: variable 'ansible_search_path' from source: unknown
41175 1727204650.07371: variable 'ansible_search_path' from source: unknown
41175 1727204650.07407: calling self._execute()
41175 1727204650.07491: variable 'ansible_host' from source: host vars for 'managed-node3'
41175 1727204650.07501: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
41175 1727204650.07509: variable 'omit' from source: magic vars
41175 1727204650.07833: variable 'ansible_distribution_major_version' from source: facts
41175 1727204650.07845: Evaluated conditional (ansible_distribution_major_version != '6'): True
41175 1727204650.07946: variable 'network_provider' from source: set_fact
41175 1727204650.07953: Evaluated conditional (network_provider == "initscripts"): False
41175 1727204650.07956: when evaluation is False, skipping this task
41175 1727204650.07960: _execute() done
41175 1727204650.07964: dumping result to json
41175 1727204650.07969: done dumping result, returning
41175 1727204650.07978: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12b410aa-8751-f070-39c4-000000000025]
41175 1727204650.07986: sending task result for task 12b410aa-8751-f070-39c4-000000000025
41175 1727204650.08084: done sending task result for task 12b410aa-8751-f070-39c4-000000000025
41175 1727204650.08087: WORKER PROCESS EXITING
skipping: [managed-node3] => {
    "changed": false,
    "false_condition": "network_provider == \"initscripts\"",
    "skip_reason": "Conditional result was False"
}
41175 1727204650.08140: no more pending results, returning what we have
41175 1727204650.08145: results queue empty
41175 1727204650.08146: checking for any_errors_fatal
41175 1727204650.08154: done checking for any_errors_fatal
41175 1727204650.08155: checking for max_fail_percentage
41175 1727204650.08156: done checking for max_fail_percentage
41175 1727204650.08158: checking to see if all hosts have failed and the running result is not ok
41175 1727204650.08159: done checking to see if all hosts have failed
41175 1727204650.08159: getting the remaining hosts for this loop
41175 1727204650.08161: done getting the remaining hosts for this loop
41175 1727204650.08166: getting the next task for host managed-node3
41175 1727204650.08174: done getting next task for host managed-node3
41175 1727204650.08179: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles
41175 1727204650.08182: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
41175 1727204650.08200: getting variables
41175 1727204650.08202: in VariableManager get_vars()
41175 1727204650.08242: Calling all_inventory to load vars for managed-node3
41175 1727204650.08245: Calling groups_inventory to load vars for managed-node3
41175 1727204650.08247: Calling all_plugins_inventory to load vars for managed-node3
41175 1727204650.08257: Calling all_plugins_play to load vars for managed-node3
41175 1727204650.08260: Calling groups_plugins_inventory to load vars for managed-node3
41175 1727204650.08263: Calling groups_plugins_play to load vars for managed-node3
41175 1727204650.09998: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41175 1727204650.11658: done with get_vars()
41175 1727204650.11683: done getting variables

TASK [fedora.linux_system_roles.network : Configure networking connection profiles] ***
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Tuesday 24 September 2024  15:04:10 -0400 (0:00:00.050)       0:00:17.256 *****
41175 1727204650.11753: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_connections
41175 1727204650.11754: Creating lock for fedora.linux_system_roles.network_connections
41175 1727204650.12010: worker is 1 (out of 1 available)
41175 1727204650.12025: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_connections
41175 1727204650.12037: done queuing things up, now waiting for results queue to drain
41175 1727204650.12039: waiting for pending results...
41175 1727204650.12230: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles
41175 1727204650.12339: in run() - task 12b410aa-8751-f070-39c4-000000000026
41175 1727204650.12352: variable 'ansible_search_path' from source: unknown
41175 1727204650.12356: variable 'ansible_search_path' from source: unknown
41175 1727204650.12392: calling self._execute()
41175 1727204650.12466: variable 'ansible_host' from source: host vars for 'managed-node3'
41175 1727204650.12472: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
41175 1727204650.12487: variable 'omit' from source: magic vars
41175 1727204650.12795: variable 'ansible_distribution_major_version' from source: facts
41175 1727204650.12806: Evaluated conditional (ansible_distribution_major_version != '6'): True
41175 1727204650.12818: variable 'omit' from source: magic vars
41175 1727204650.12869: variable 'omit' from source: magic vars
41175 1727204650.13009: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
41175 1727204650.14713: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
41175 1727204650.14769: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
41175 1727204650.14802: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
41175 1727204650.14835: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
41175 1727204650.14858: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
41175 1727204650.14931: variable 'network_provider' from source: set_fact
41175 1727204650.15039: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
41175 1727204650.15072: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
41175 1727204650.15095: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
41175 1727204650.15136: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
41175 1727204650.15149: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
41175 1727204650.15211: variable 'omit' from source: magic vars
41175 1727204650.15306: variable 'omit' from source: magic vars
41175 1727204650.15394: variable 'network_connections' from source: task vars
41175 1727204650.15409: variable 'interface' from source: set_fact
41175 1727204650.15472: variable 'interface' from source: set_fact
41175 1727204650.15477: variable 'interface' from source: set_fact
41175 1727204650.15530: variable 'interface' from source: set_fact
41175 1727204650.15710: variable 'omit' from source: magic vars
41175 1727204650.15718: variable '__lsr_ansible_managed' from source: task vars
41175 1727204650.15771: variable '__lsr_ansible_managed' from source: task vars
41175 1727204650.16010: Loaded config def from plugin (lookup/template)
41175 1727204650.16014: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py
41175 1727204650.16040: File lookup term: get_ansible_managed.j2
41175 1727204650.16043: variable 'ansible_search_path' from source: unknown
41175 1727204650.16049: evaluation_path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks
41175 1727204650.16062: search_path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2
41175 1727204650.16076: variable 'ansible_search_path' from source: unknown
41175 1727204650.21516: variable 'ansible_managed' from source: unknown
41175 1727204650.21648: variable 'omit' from source: magic vars
41175 1727204650.21673: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
41175 1727204650.21699: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
41175 1727204650.21716: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
41175 1727204650.21736: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
41175 1727204650.21747: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
41175 1727204650.21772: variable 'inventory_hostname' from source: host vars for 'managed-node3'
41175 1727204650.21776: variable 'ansible_host' from source: host vars for 'managed-node3'
41175 1727204650.21779: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
41175 1727204650.21865: Set connection var ansible_shell_executable to /bin/sh
41175 1727204650.21868: Set connection var ansible_shell_type to sh
41175 1727204650.21871: Set connection var ansible_pipelining to False
41175 1727204650.21882: Set connection var ansible_timeout to 10
41175 1727204650.21888: Set connection var ansible_connection to ssh
41175 1727204650.21897: Set connection var ansible_module_compression to ZIP_DEFLATED
41175 1727204650.21919: variable 'ansible_shell_executable' from source: unknown
41175 1727204650.21922: variable 'ansible_connection' from source: unknown
41175 1727204650.21925: variable 'ansible_module_compression' from source: unknown
41175 1727204650.21927: variable 'ansible_shell_type' from source: unknown
41175 1727204650.21931: variable 'ansible_shell_executable' from source: unknown
41175 1727204650.21933: variable 'ansible_host' from source: host vars for 'managed-node3'
41175 1727204650.21940: variable 'ansible_pipelining' from source: unknown
41175 1727204650.21942: variable 'ansible_timeout' from source: unknown
41175 1727204650.21962: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
41175 1727204650.22064: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action)
41175 1727204650.22078: variable 'omit' from source: magic vars
41175 1727204650.22088: starting attempt loop
41175 1727204650.22093: running the handler
41175 1727204650.22101: _low_level_execute_command(): starting
41175 1727204650.22109: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
41175 1727204650.22661: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<<
41175 1727204650.22665: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
41175 1727204650.22668: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
41175 1727204650.22670: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
41175 1727204650.22726: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
41175 1727204650.22729: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
41175 1727204650.22783: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
41175 1727204650.24548: stdout chunk (state=3): >>>/root <<<
41175 1727204650.24654: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
41175 1727204650.24720: stderr chunk (state=3): >>><<<
41175 1727204650.24728: stdout chunk (state=3): >>><<<
41175 1727204650.24752: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
41175 1727204650.24764: _low_level_execute_command(): starting
41175 1727204650.24773: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204650.2475176-41990-261309636254624 `" && echo ansible-tmp-1727204650.2475176-41990-261309636254624="` echo /root/.ansible/tmp/ansible-tmp-1727204650.2475176-41990-261309636254624 `" ) && sleep 0'
41175 1727204650.25256: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
41175 1727204650.25260: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
41175 1727204650.25263: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
41175 1727204650.25265: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
41175 1727204650.25326: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
41175 1727204650.25329: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
41175 1727204650.25365: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
41175 1727204650.27357: stdout chunk (state=3): >>>ansible-tmp-1727204650.2475176-41990-261309636254624=/root/.ansible/tmp/ansible-tmp-1727204650.2475176-41990-261309636254624 <<<
41175 1727204650.27478: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
41175 1727204650.27537: stderr chunk (state=3): >>><<<
41175 1727204650.27540: stdout chunk (state=3): >>><<<
41175 1727204650.27552: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204650.2475176-41990-261309636254624=/root/.ansible/tmp/ansible-tmp-1727204650.2475176-41990-261309636254624 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204650.27596: variable 'ansible_module_compression' from source: unknown 41175 1727204650.27640: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 41175 1727204650.27644: ANSIBALLZ: Acquiring lock 41175 1727204650.27652: ANSIBALLZ: Lock acquired: 140088838680944 41175 1727204650.27655: ANSIBALLZ: Creating module 41175 1727204650.54397: ANSIBALLZ: Writing module into payload 41175 1727204650.54839: ANSIBALLZ: Writing module 41175 1727204650.54880: ANSIBALLZ: Renaming module 41175 1727204650.54956: ANSIBALLZ: Done creating module 41175 1727204650.54959: variable 'ansible_facts' from source: unknown 41175 1727204650.55062: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204650.2475176-41990-261309636254624/AnsiballZ_network_connections.py 41175 1727204650.55329: Sending initial data 41175 1727204650.55332: Sent initial data (168 bytes) 41175 1727204650.55957: 
stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 1727204650.55991: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204650.56106: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204650.56128: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204650.56145: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204650.56167: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204650.56243: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204650.57980: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 
1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41175 1727204650.58033: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 41175 1727204650.58100: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41175qa2aqh90/tmpp7e13s0i /root/.ansible/tmp/ansible-tmp-1727204650.2475176-41990-261309636254624/AnsiballZ_network_connections.py <<< 41175 1727204650.58125: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204650.2475176-41990-261309636254624/AnsiballZ_network_connections.py" debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-41175qa2aqh90/tmpp7e13s0i" to remote "/root/.ansible/tmp/ansible-tmp-1727204650.2475176-41990-261309636254624/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204650.2475176-41990-261309636254624/AnsiballZ_network_connections.py" <<< 41175 1727204650.59844: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204650.59995: stderr chunk (state=3): >>><<< 41175 1727204650.59998: stdout chunk (state=3): >>><<< 41175 1727204650.60001: done transferring module to remote 41175 1727204650.60009: _low_level_execute_command(): starting 41175 1727204650.60018: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204650.2475176-41990-261309636254624/ /root/.ansible/tmp/ansible-tmp-1727204650.2475176-41990-261309636254624/AnsiballZ_network_connections.py && sleep 0' 41175 1727204650.60678: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 1727204650.60707: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 41175 1727204650.60813: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204650.60838: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204650.60860: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204650.60873: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204650.60940: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204650.62932: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204650.62951: stdout chunk (state=3): >>><<< 41175 1727204650.62967: stderr chunk (state=3): >>><<< 41175 1727204650.62993: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204650.63004: _low_level_execute_command(): starting 41175 1727204650.63015: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204650.2475176-41990-261309636254624/AnsiballZ_network_connections.py && sleep 0' 41175 1727204650.63629: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 1727204650.63653: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204650.63670: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204650.63688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204650.63709: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 41175 1727204650.63761: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204650.63825: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204650.63856: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204650.63949: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204651.05063: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 270b38fe-b70b-4444-a0b4-74394ad4b2b5\n[004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 270b38fe-b70b-4444-a0b4-74394ad4b2b5 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"dhcp4": false, "address": ["198.51.100.3/26"], "route": [{"network": "198.51.100.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2, "table": 30400}, {"network": "198.51.100.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4, "table": 30200}, {"network": "192.0.2.64", "prefix": 26, "gateway": "198.51.100.8", "metric": 50, "table": 30200, "src": "198.51.100.3"}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"dhcp4": false, "address": ["198.51.100.3/26"], "route": [{"network": "198.51.100.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2, "table": 30400}, {"network": "198.51.100.64", 
"prefix": 26, "gateway": "198.51.100.6", "metric": 4, "table": 30200}, {"network": "192.0.2.64", "prefix": 26, "gateway": "198.51.100.8", "metric": 50, "table": 30200, "src": "198.51.100.3"}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 41175 1727204651.07041: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 41175 1727204651.07107: stderr chunk (state=3): >>><<< 41175 1727204651.07111: stdout chunk (state=3): >>><<< 41175 1727204651.07130: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 270b38fe-b70b-4444-a0b4-74394ad4b2b5\n[004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 270b38fe-b70b-4444-a0b4-74394ad4b2b5 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"dhcp4": false, "address": ["198.51.100.3/26"], "route": [{"network": "198.51.100.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2, "table": 30400}, {"network": "198.51.100.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4, "table": 30200}, {"network": "192.0.2.64", "prefix": 26, "gateway": "198.51.100.8", "metric": 50, "table": 30200, "src": "198.51.100.3"}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"dhcp4": false, "address": ["198.51.100.3/26"], "route": [{"network": "198.51.100.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2, "table": 
30400}, {"network": "198.51.100.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4, "table": 30200}, {"network": "192.0.2.64", "prefix": 26, "gateway": "198.51.100.8", "metric": 50, "table": 30200, "src": "198.51.100.3"}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
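The `module_args` in the result above are easier to read as the role variables that would produce them. A sketch of the corresponding `network_connections` input, reconstructed from this log (the actual playbook is not shown here, so treat key placement as an approximation of the linux-system-roles/network schema):

```yaml
# Reconstructed from the module_args in the log above; not copied from the
# actual playbook. One ethernet profile with a static address and three
# policy-routed static routes (per-route table/metric/src).
network_connections:
  - name: ethtest0
    interface_name: ethtest0
    type: ethernet
    state: up
    autoconnect: true
    ip:
      dhcp4: false
      address:
        - 198.51.100.3/26
      route:
        - network: 198.51.100.128
          prefix: 26
          gateway: 198.51.100.1
          metric: 2
          table: 30400
        - network: 198.51.100.64
          prefix: 26
          gateway: 198.51.100.6
          metric: 4
          table: 30200
        - network: 192.0.2.64
          prefix: 26
          gateway: 198.51.100.8
          metric: 50
          table: 30200
          src: 198.51.100.3
```

Note that the `nm` provider and the `# Ansible managed` header are injected by the role itself; only the `network_connections` list comes from user input.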
41175 1727204651.07201: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'ethtest0', 'interface_name': 'ethtest0', 'state': 'up', 'type': 'ethernet', 'autoconnect': True, 'ip': {'dhcp4': False, 'address': ['198.51.100.3/26'], 'route': [{'network': '198.51.100.128', 'prefix': 26, 'gateway': '198.51.100.1', 'metric': 2, 'table': 30400}, {'network': '198.51.100.64', 'prefix': 26, 'gateway': '198.51.100.6', 'metric': 4, 'table': 30200}, {'network': '192.0.2.64', 'prefix': 26, 'gateway': '198.51.100.8', 'metric': 50, 'table': 30200, 'src': '198.51.100.3'}]}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204650.2475176-41990-261309636254624/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41175 1727204651.07211: _low_level_execute_command(): starting 41175 1727204651.07220: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204650.2475176-41990-261309636254624/ > /dev/null 2>&1 && sleep 0' 41175 1727204651.07678: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204651.07725: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 
1727204651.07728: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 41175 1727204651.07731: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 41175 1727204651.07733: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204651.07735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204651.07780: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204651.07784: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204651.07834: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204651.09793: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204651.09841: stderr chunk (state=3): >>><<< 41175 1727204651.09845: stdout chunk (state=3): >>><<< 41175 1727204651.09859: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204651.09869: handler run complete 41175 1727204651.09927: attempt loop complete, returning result 41175 1727204651.09930: _execute() done 41175 1727204651.09933: dumping result to json 41175 1727204651.09941: done dumping result, returning 41175 1727204651.09951: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12b410aa-8751-f070-39c4-000000000026] 41175 1727204651.09957: sending task result for task 12b410aa-8751-f070-39c4-000000000026 41175 1727204651.10089: done sending task result for task 12b410aa-8751-f070-39c4-000000000026 41175 1727204651.10093: WORKER PROCESS EXITING changed: [managed-node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": true, "interface_name": "ethtest0", "ip": { "address": [ "198.51.100.3/26" ], "dhcp4": false, "route": [ { "gateway": "198.51.100.1", "metric": 2, "network": "198.51.100.128", "prefix": 26, "table": 30400 }, { "gateway": "198.51.100.6", "metric": 4, "network": "198.51.100.64", "prefix": 26, "table": 30200 }, { "gateway": "198.51.100.8", "metric": 50, "network": "192.0.2.64", "prefix": 26, "src": "198.51.100.3", "table": 30200 } ] }, "name": "ethtest0", "state": 
"up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [003] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 270b38fe-b70b-4444-a0b4-74394ad4b2b5 [004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 270b38fe-b70b-4444-a0b4-74394ad4b2b5 (not-active) 41175 1727204651.10306: no more pending results, returning what we have 41175 1727204651.10310: results queue empty 41175 1727204651.10311: checking for any_errors_fatal 41175 1727204651.10317: done checking for any_errors_fatal 41175 1727204651.10318: checking for max_fail_percentage 41175 1727204651.10320: done checking for max_fail_percentage 41175 1727204651.10321: checking to see if all hosts have failed and the running result is not ok 41175 1727204651.10322: done checking to see if all hosts have failed 41175 1727204651.10322: getting the remaining hosts for this loop 41175 1727204651.10324: done getting the remaining hosts for this loop 41175 1727204651.10329: getting the next task for host managed-node3 41175 1727204651.10335: done getting next task for host managed-node3 41175 1727204651.10339: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 41175 1727204651.10342: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204651.10354: getting variables 41175 1727204651.10356: in VariableManager get_vars() 41175 1727204651.10397: Calling all_inventory to load vars for managed-node3 41175 1727204651.10407: Calling groups_inventory to load vars for managed-node3 41175 1727204651.10410: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204651.10421: Calling all_plugins_play to load vars for managed-node3 41175 1727204651.10424: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204651.10428: Calling groups_plugins_play to load vars for managed-node3 41175 1727204651.11683: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204651.13297: done with get_vars() 41175 1727204651.13324: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 15:04:11 -0400 (0:00:01.016) 0:00:18.272 ***** 41175 1727204651.13402: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_state 41175 1727204651.13403: Creating lock for fedora.linux_system_roles.network_state 41175 1727204651.13673: worker is 1 (out of 1 available) 41175 1727204651.13690: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_state 41175 1727204651.13703: done queuing things up, now waiting for results queue to drain 41175 1727204651.13705: waiting for pending results... 
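The "Configure networking state" task queued here is gated by the conditional `network_state != {}`, and the role default for `network_state` is an empty dict, which is why the log records a skip a few lines below. A minimal sketch of that gate (an illustrative helper, not code from Ansible or the role):

```python
# Minimal sketch (not Ansible source) of the task conditional seen in the log:
# the network_state task only runs when the role variable is a non-empty dict.

def should_run_network_state(network_state: dict) -> bool:
    """Mirror the Jinja conditional `network_state != {}`."""
    return network_state != {}

# The role default is {}, so the task is skipped in this run.
print(should_run_network_state({}))                  # False -> task skipped
print(should_run_network_state({"interfaces": []}))  # True  -> task would run
```

This is why a playbook that only sets `network_connections` (as above) always skips the `network_state` path: the two variables select between the `nm` connection-profile provider and the nmstate-based provider.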
41175 1727204651.13905: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state 41175 1727204651.14011: in run() - task 12b410aa-8751-f070-39c4-000000000027 41175 1727204651.14024: variable 'ansible_search_path' from source: unknown 41175 1727204651.14028: variable 'ansible_search_path' from source: unknown 41175 1727204651.14064: calling self._execute() 41175 1727204651.14147: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204651.14151: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204651.14164: variable 'omit' from source: magic vars 41175 1727204651.14474: variable 'ansible_distribution_major_version' from source: facts 41175 1727204651.14493: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204651.14587: variable 'network_state' from source: role '' defaults 41175 1727204651.14599: Evaluated conditional (network_state != {}): False 41175 1727204651.14608: when evaluation is False, skipping this task 41175 1727204651.14611: _execute() done 41175 1727204651.14614: dumping result to json 41175 1727204651.14622: done dumping result, returning 41175 1727204651.14631: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state [12b410aa-8751-f070-39c4-000000000027] 41175 1727204651.14637: sending task result for task 12b410aa-8751-f070-39c4-000000000027 41175 1727204651.14735: done sending task result for task 12b410aa-8751-f070-39c4-000000000027 41175 1727204651.14739: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 41175 1727204651.14854: no more pending results, returning what we have 41175 1727204651.14859: results queue empty 41175 1727204651.14860: checking for any_errors_fatal 41175 1727204651.14877: done checking for any_errors_fatal 
41175 1727204651.14878: checking for max_fail_percentage 41175 1727204651.14881: done checking for max_fail_percentage 41175 1727204651.14882: checking to see if all hosts have failed and the running result is not ok 41175 1727204651.14883: done checking to see if all hosts have failed 41175 1727204651.14884: getting the remaining hosts for this loop 41175 1727204651.14886: done getting the remaining hosts for this loop 41175 1727204651.14893: getting the next task for host managed-node3 41175 1727204651.14904: done getting next task for host managed-node3 41175 1727204651.14909: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 41175 1727204651.14913: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204651.14935: getting variables 41175 1727204651.14937: in VariableManager get_vars() 41175 1727204651.14985: Calling all_inventory to load vars for managed-node3 41175 1727204651.15165: Calling groups_inventory to load vars for managed-node3 41175 1727204651.15171: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204651.15183: Calling all_plugins_play to load vars for managed-node3 41175 1727204651.15186: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204651.15192: Calling groups_plugins_play to load vars for managed-node3 41175 1727204651.16895: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204651.18888: done with get_vars() 41175 1727204651.18928: done getting variables 41175 1727204651.18993: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 15:04:11 -0400 (0:00:00.056) 0:00:18.329 ***** 41175 1727204651.19034: entering _queue_task() for managed-node3/debug 41175 1727204651.19373: worker is 1 (out of 1 available) 41175 1727204651.19387: exiting _queue_task() for managed-node3/debug 41175 1727204651.19503: done queuing things up, now waiting for results queue to drain 41175 1727204651.19505: waiting for pending results... 
41175 1727204651.19810: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 41175 1727204651.19995: in run() - task 12b410aa-8751-f070-39c4-000000000028 41175 1727204651.19999: variable 'ansible_search_path' from source: unknown 41175 1727204651.20002: variable 'ansible_search_path' from source: unknown 41175 1727204651.20006: calling self._execute() 41175 1727204651.20088: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204651.20105: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204651.20129: variable 'omit' from source: magic vars 41175 1727204651.20575: variable 'ansible_distribution_major_version' from source: facts 41175 1727204651.20598: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204651.20610: variable 'omit' from source: magic vars 41175 1727204651.20697: variable 'omit' from source: magic vars 41175 1727204651.20747: variable 'omit' from source: magic vars 41175 1727204651.20805: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41175 1727204651.20856: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41175 1727204651.20889: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41175 1727204651.20921: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204651.21094: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204651.21097: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41175 1727204651.21100: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204651.21102: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node3' 41175 1727204651.21127: Set connection var ansible_shell_executable to /bin/sh 41175 1727204651.21135: Set connection var ansible_shell_type to sh 41175 1727204651.21147: Set connection var ansible_pipelining to False 41175 1727204651.21163: Set connection var ansible_timeout to 10 41175 1727204651.21174: Set connection var ansible_connection to ssh 41175 1727204651.21185: Set connection var ansible_module_compression to ZIP_DEFLATED 41175 1727204651.21225: variable 'ansible_shell_executable' from source: unknown 41175 1727204651.21234: variable 'ansible_connection' from source: unknown 41175 1727204651.21243: variable 'ansible_module_compression' from source: unknown 41175 1727204651.21251: variable 'ansible_shell_type' from source: unknown 41175 1727204651.21258: variable 'ansible_shell_executable' from source: unknown 41175 1727204651.21265: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204651.21274: variable 'ansible_pipelining' from source: unknown 41175 1727204651.21281: variable 'ansible_timeout' from source: unknown 41175 1727204651.21291: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204651.21464: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41175 1727204651.21484: variable 'omit' from source: magic vars 41175 1727204651.21497: starting attempt loop 41175 1727204651.21505: running the handler 41175 1727204651.21677: variable '__network_connections_result' from source: set_fact 41175 1727204651.21751: handler run complete 41175 1727204651.21785: attempt loop complete, returning result 41175 1727204651.21869: _execute() done 41175 1727204651.21873: dumping result to json 41175 1727204651.21875: 
done dumping result, returning 41175 1727204651.21878: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12b410aa-8751-f070-39c4-000000000028] 41175 1727204651.21880: sending task result for task 12b410aa-8751-f070-39c4-000000000028 41175 1727204651.21956: done sending task result for task 12b410aa-8751-f070-39c4-000000000028 41175 1727204651.21960: WORKER PROCESS EXITING ok: [managed-node3] => { "__network_connections_result.stderr_lines": [ "[003] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 270b38fe-b70b-4444-a0b4-74394ad4b2b5", "[004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 270b38fe-b70b-4444-a0b4-74394ad4b2b5 (not-active)" ] } 41175 1727204651.22046: no more pending results, returning what we have 41175 1727204651.22051: results queue empty 41175 1727204651.22052: checking for any_errors_fatal 41175 1727204651.22063: done checking for any_errors_fatal 41175 1727204651.22064: checking for max_fail_percentage 41175 1727204651.22066: done checking for max_fail_percentage 41175 1727204651.22067: checking to see if all hosts have failed and the running result is not ok 41175 1727204651.22068: done checking to see if all hosts have failed 41175 1727204651.22069: getting the remaining hosts for this loop 41175 1727204651.22071: done getting the remaining hosts for this loop 41175 1727204651.22076: getting the next task for host managed-node3 41175 1727204651.22084: done getting next task for host managed-node3 41175 1727204651.22090: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 41175 1727204651.22094: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41175 1727204651.22108: getting variables 41175 1727204651.22110: in VariableManager get_vars() 41175 1727204651.22159: Calling all_inventory to load vars for managed-node3 41175 1727204651.22162: Calling groups_inventory to load vars for managed-node3 41175 1727204651.22165: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204651.22179: Calling all_plugins_play to load vars for managed-node3 41175 1727204651.22183: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204651.22187: Calling groups_plugins_play to load vars for managed-node3 41175 1727204651.24645: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204651.27683: done with get_vars() 41175 1727204651.27725: done getting variables 41175 1727204651.27797: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 15:04:11 -0400 (0:00:00.088) 0:00:18.417 ***** 41175 1727204651.27838: entering _queue_task() for managed-node3/debug 41175 1727204651.28199: worker is 1 (out of 1 available) 41175 1727204651.28212: exiting _queue_task() for 
managed-node3/debug 41175 1727204651.28228: done queuing things up, now waiting for results queue to drain 41175 1727204651.28230: waiting for pending results... 41175 1727204651.28613: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 41175 1727204651.28742: in run() - task 12b410aa-8751-f070-39c4-000000000029 41175 1727204651.28765: variable 'ansible_search_path' from source: unknown 41175 1727204651.28773: variable 'ansible_search_path' from source: unknown 41175 1727204651.28827: calling self._execute() 41175 1727204651.28933: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204651.28948: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204651.28966: variable 'omit' from source: magic vars 41175 1727204651.29423: variable 'ansible_distribution_major_version' from source: facts 41175 1727204651.29443: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204651.29455: variable 'omit' from source: magic vars 41175 1727204651.29539: variable 'omit' from source: magic vars 41175 1727204651.29686: variable 'omit' from source: magic vars 41175 1727204651.29692: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41175 1727204651.29695: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41175 1727204651.29725: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41175 1727204651.29751: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204651.29768: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204651.29812: variable 'inventory_hostname' from source: host vars for 
'managed-node3' 41175 1727204651.29825: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204651.29834: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204651.29973: Set connection var ansible_shell_executable to /bin/sh 41175 1727204651.29982: Set connection var ansible_shell_type to sh 41175 1727204651.29997: Set connection var ansible_pipelining to False 41175 1727204651.30016: Set connection var ansible_timeout to 10 41175 1727204651.30030: Set connection var ansible_connection to ssh 41175 1727204651.30042: Set connection var ansible_module_compression to ZIP_DEFLATED 41175 1727204651.30071: variable 'ansible_shell_executable' from source: unknown 41175 1727204651.30080: variable 'ansible_connection' from source: unknown 41175 1727204651.30088: variable 'ansible_module_compression' from source: unknown 41175 1727204651.30098: variable 'ansible_shell_type' from source: unknown 41175 1727204651.30106: variable 'ansible_shell_executable' from source: unknown 41175 1727204651.30123: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204651.30195: variable 'ansible_pipelining' from source: unknown 41175 1727204651.30199: variable 'ansible_timeout' from source: unknown 41175 1727204651.30201: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204651.30319: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41175 1727204651.30342: variable 'omit' from source: magic vars 41175 1727204651.30355: starting attempt loop 41175 1727204651.30363: running the handler 41175 1727204651.30427: variable '__network_connections_result' from source: set_fact 41175 1727204651.30534: variable 
'__network_connections_result' from source: set_fact 41175 1727204651.30807: handler run complete 41175 1727204651.30872: attempt loop complete, returning result 41175 1727204651.30885: _execute() done 41175 1727204651.30894: dumping result to json 41175 1727204651.30991: done dumping result, returning 41175 1727204651.30996: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12b410aa-8751-f070-39c4-000000000029] 41175 1727204651.30999: sending task result for task 12b410aa-8751-f070-39c4-000000000029 41175 1727204651.31081: done sending task result for task 12b410aa-8751-f070-39c4-000000000029 41175 1727204651.31084: WORKER PROCESS EXITING ok: [managed-node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": true, "interface_name": "ethtest0", "ip": { "address": [ "198.51.100.3/26" ], "dhcp4": false, "route": [ { "gateway": "198.51.100.1", "metric": 2, "network": "198.51.100.128", "prefix": 26, "table": 30400 }, { "gateway": "198.51.100.6", "metric": 4, "network": "198.51.100.64", "prefix": 26, "table": 30200 }, { "gateway": "198.51.100.8", "metric": 50, "network": "192.0.2.64", "prefix": 26, "src": "198.51.100.3", "table": 30200 } ] }, "name": "ethtest0", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[003] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 270b38fe-b70b-4444-a0b4-74394ad4b2b5\n[004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 270b38fe-b70b-4444-a0b4-74394ad4b2b5 (not-active)\n", "stderr_lines": [ "[003] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 270b38fe-b70b-4444-a0b4-74394ad4b2b5", "[004] #0, state:up 
persistent_state:present, 'ethtest0': up connection ethtest0, 270b38fe-b70b-4444-a0b4-74394ad4b2b5 (not-active)" ] } } 41175 1727204651.31235: no more pending results, returning what we have 41175 1727204651.31240: results queue empty 41175 1727204651.31241: checking for any_errors_fatal 41175 1727204651.31248: done checking for any_errors_fatal 41175 1727204651.31249: checking for max_fail_percentage 41175 1727204651.31251: done checking for max_fail_percentage 41175 1727204651.31252: checking to see if all hosts have failed and the running result is not ok 41175 1727204651.31253: done checking to see if all hosts have failed 41175 1727204651.31254: getting the remaining hosts for this loop 41175 1727204651.31262: done getting the remaining hosts for this loop 41175 1727204651.31267: getting the next task for host managed-node3 41175 1727204651.31274: done getting next task for host managed-node3 41175 1727204651.31279: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 41175 1727204651.31282: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
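For reference, the `module_args` printed in the debug result above imply a `network_connections` role variable along the following lines. This is a hypothetical reconstruction from the logged output only, not the actual playbook source; the real play may define it differently (e.g. via `vars_files` or role defaults):

```yaml
# Sketch of the role input implied by the logged module_args above.
# All values are copied from the debug output; layout is assumed.
network_connections:
  - name: ethtest0
    interface_name: ethtest0
    type: ethernet
    state: up
    autoconnect: true
    ip:
      dhcp4: false
      address:
        - 198.51.100.3/26
      route:
        - network: 198.51.100.128
          prefix: 26
          gateway: 198.51.100.1
          metric: 2
          table: 30400
        - network: 198.51.100.64
          prefix: 26
          gateway: 198.51.100.6
          metric: 4
          table: 30200
        - network: 192.0.2.64
          prefix: 26
          gateway: 198.51.100.8
          metric: 50
          src: 198.51.100.3
          table: 30200
```

The skipped "Configure networking state" and "Show debug messages for the network_state" tasks earlier in the log are consistent with this: only `network_connections` is set, so `network_state` stays at its empty default and the `network_state != {}` conditionals evaluate to False.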
False 41175 1727204651.31499: getting variables 41175 1727204651.31501: in VariableManager get_vars() 41175 1727204651.31544: Calling all_inventory to load vars for managed-node3 41175 1727204651.31547: Calling groups_inventory to load vars for managed-node3 41175 1727204651.31550: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204651.31561: Calling all_plugins_play to load vars for managed-node3 41175 1727204651.31565: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204651.31569: Calling groups_plugins_play to load vars for managed-node3 41175 1727204651.33808: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204651.36711: done with get_vars() 41175 1727204651.36743: done getting variables 41175 1727204651.36804: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 15:04:11 -0400 (0:00:00.089) 0:00:18.507 ***** 41175 1727204651.36834: entering _queue_task() for managed-node3/debug 41175 1727204651.37106: worker is 1 (out of 1 available) 41175 1727204651.37124: exiting _queue_task() for managed-node3/debug 41175 1727204651.37137: done queuing things up, now waiting for results queue to drain 41175 1727204651.37139: waiting for pending results... 
41175 1727204651.37338: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 41175 1727204651.37445: in run() - task 12b410aa-8751-f070-39c4-00000000002a 41175 1727204651.37457: variable 'ansible_search_path' from source: unknown 41175 1727204651.37461: variable 'ansible_search_path' from source: unknown 41175 1727204651.37498: calling self._execute() 41175 1727204651.37581: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204651.37585: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204651.37601: variable 'omit' from source: magic vars 41175 1727204651.37927: variable 'ansible_distribution_major_version' from source: facts 41175 1727204651.37939: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204651.38049: variable 'network_state' from source: role '' defaults 41175 1727204651.38056: Evaluated conditional (network_state != {}): False 41175 1727204651.38059: when evaluation is False, skipping this task 41175 1727204651.38064: _execute() done 41175 1727204651.38069: dumping result to json 41175 1727204651.38074: done dumping result, returning 41175 1727204651.38082: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12b410aa-8751-f070-39c4-00000000002a] 41175 1727204651.38090: sending task result for task 12b410aa-8751-f070-39c4-00000000002a 41175 1727204651.38183: done sending task result for task 12b410aa-8751-f070-39c4-00000000002a 41175 1727204651.38187: WORKER PROCESS EXITING skipping: [managed-node3] => { "false_condition": "network_state != {}" } 41175 1727204651.38243: no more pending results, returning what we have 41175 1727204651.38248: results queue empty 41175 1727204651.38249: checking for any_errors_fatal 41175 1727204651.38263: done checking for any_errors_fatal 41175 1727204651.38264: checking for 
max_fail_percentage 41175 1727204651.38266: done checking for max_fail_percentage 41175 1727204651.38267: checking to see if all hosts have failed and the running result is not ok 41175 1727204651.38268: done checking to see if all hosts have failed 41175 1727204651.38269: getting the remaining hosts for this loop 41175 1727204651.38270: done getting the remaining hosts for this loop 41175 1727204651.38275: getting the next task for host managed-node3 41175 1727204651.38282: done getting next task for host managed-node3 41175 1727204651.38287: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 41175 1727204651.38292: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204651.38309: getting variables 41175 1727204651.38311: in VariableManager get_vars() 41175 1727204651.38351: Calling all_inventory to load vars for managed-node3 41175 1727204651.38356: Calling groups_inventory to load vars for managed-node3 41175 1727204651.38359: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204651.38369: Calling all_plugins_play to load vars for managed-node3 41175 1727204651.38372: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204651.38375: Calling groups_plugins_play to load vars for managed-node3 41175 1727204651.40299: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204651.42029: done with get_vars() 41175 1727204651.42069: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 15:04:11 -0400 (0:00:00.053) 0:00:18.560 ***** 41175 1727204651.42193: entering _queue_task() for managed-node3/ping 41175 1727204651.42196: Creating lock for ping 41175 1727204651.42695: worker is 1 (out of 1 available) 41175 1727204651.42707: exiting _queue_task() for managed-node3/ping 41175 1727204651.42721: done queuing things up, now waiting for results queue to drain 41175 1727204651.42723: waiting for pending results... 
41175 1727204651.43110: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 41175 1727204651.43140: in run() - task 12b410aa-8751-f070-39c4-00000000002b 41175 1727204651.43166: variable 'ansible_search_path' from source: unknown 41175 1727204651.43176: variable 'ansible_search_path' from source: unknown 41175 1727204651.43231: calling self._execute() 41175 1727204651.43349: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204651.43365: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204651.43398: variable 'omit' from source: magic vars 41175 1727204651.43726: variable 'ansible_distribution_major_version' from source: facts 41175 1727204651.43739: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204651.43746: variable 'omit' from source: magic vars 41175 1727204651.43800: variable 'omit' from source: magic vars 41175 1727204651.43835: variable 'omit' from source: magic vars 41175 1727204651.43871: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41175 1727204651.43904: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41175 1727204651.43925: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41175 1727204651.43945: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204651.43956: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204651.43985: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41175 1727204651.43989: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204651.43996: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node3' 41175 1727204651.44083: Set connection var ansible_shell_executable to /bin/sh 41175 1727204651.44087: Set connection var ansible_shell_type to sh 41175 1727204651.44095: Set connection var ansible_pipelining to False 41175 1727204651.44106: Set connection var ansible_timeout to 10 41175 1727204651.44111: Set connection var ansible_connection to ssh 41175 1727204651.44118: Set connection var ansible_module_compression to ZIP_DEFLATED 41175 1727204651.44139: variable 'ansible_shell_executable' from source: unknown 41175 1727204651.44142: variable 'ansible_connection' from source: unknown 41175 1727204651.44147: variable 'ansible_module_compression' from source: unknown 41175 1727204651.44150: variable 'ansible_shell_type' from source: unknown 41175 1727204651.44152: variable 'ansible_shell_executable' from source: unknown 41175 1727204651.44161: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204651.44164: variable 'ansible_pipelining' from source: unknown 41175 1727204651.44166: variable 'ansible_timeout' from source: unknown 41175 1727204651.44170: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204651.44346: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 41175 1727204651.44357: variable 'omit' from source: magic vars 41175 1727204651.44363: starting attempt loop 41175 1727204651.44366: running the handler 41175 1727204651.44382: _low_level_execute_command(): starting 41175 1727204651.44391: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41175 1727204651.44928: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204651.44933: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204651.44937: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204651.44987: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204651.44998: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204651.45038: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204651.46795: stdout chunk (state=3): >>>/root <<< 41175 1727204651.46902: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204651.46964: stderr chunk (state=3): >>><<< 41175 1727204651.46967: stdout chunk (state=3): >>><<< 41175 1727204651.46996: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204651.47002: _low_level_execute_command(): starting 41175 1727204651.47009: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204651.469861-42028-46630234604488 `" && echo ansible-tmp-1727204651.469861-42028-46630234604488="` echo /root/.ansible/tmp/ansible-tmp-1727204651.469861-42028-46630234604488 `" ) && sleep 0' 41175 1727204651.47457: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204651.47463: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204651.47466: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204651.47474: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204651.47533: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204651.47536: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204651.47570: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204651.49536: stdout chunk (state=3): >>>ansible-tmp-1727204651.469861-42028-46630234604488=/root/.ansible/tmp/ansible-tmp-1727204651.469861-42028-46630234604488 <<< 41175 1727204651.49655: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204651.49703: stderr chunk (state=3): >>><<< 41175 1727204651.49706: stdout chunk (state=3): >>><<< 41175 1727204651.49725: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204651.469861-42028-46630234604488=/root/.ansible/tmp/ansible-tmp-1727204651.469861-42028-46630234604488 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 
originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204651.49768: variable 'ansible_module_compression' from source: unknown 41175 1727204651.49806: ANSIBALLZ: Using lock for ping 41175 1727204651.49809: ANSIBALLZ: Acquiring lock 41175 1727204651.49812: ANSIBALLZ: Lock acquired: 140088833531664 41175 1727204651.49821: ANSIBALLZ: Creating module 41175 1727204651.64197: ANSIBALLZ: Writing module into payload 41175 1727204651.64248: ANSIBALLZ: Writing module 41175 1727204651.64268: ANSIBALLZ: Renaming module 41175 1727204651.64275: ANSIBALLZ: Done creating module 41175 1727204651.64291: variable 'ansible_facts' from source: unknown 41175 1727204651.64339: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204651.469861-42028-46630234604488/AnsiballZ_ping.py 41175 1727204651.64454: Sending initial data 41175 1727204651.64458: Sent initial data (151 bytes) 41175 1727204651.64954: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204651.64958: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204651.64961: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 41175 1727204651.64963: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204651.65025: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204651.65031: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204651.65040: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204651.65074: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204651.66811: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41175 1727204651.66842: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41175 1727204651.66884: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41175qa2aqh90/tmplz1xbj0y /root/.ansible/tmp/ansible-tmp-1727204651.469861-42028-46630234604488/AnsiballZ_ping.py <<< 41175 1727204651.66888: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204651.469861-42028-46630234604488/AnsiballZ_ping.py" <<< 41175 1727204651.66916: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-41175qa2aqh90/tmplz1xbj0y" to remote "/root/.ansible/tmp/ansible-tmp-1727204651.469861-42028-46630234604488/AnsiballZ_ping.py" <<< 41175 1727204651.66925: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204651.469861-42028-46630234604488/AnsiballZ_ping.py" <<< 41175 1727204651.67667: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204651.67746: stderr chunk (state=3): >>><<< 41175 1727204651.67750: stdout chunk (state=3): >>><<< 41175 1727204651.67773: done transferring module to remote 41175 1727204651.67783: _low_level_execute_command(): starting 41175 1727204651.67791: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204651.469861-42028-46630234604488/ /root/.ansible/tmp/ansible-tmp-1727204651.469861-42028-46630234604488/AnsiballZ_ping.py && sleep 0' 41175 1727204651.68268: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204651.68273: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 41175 1727204651.68276: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address <<< 41175 1727204651.68279: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204651.68287: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204651.68338: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204651.68341: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204651.68380: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204651.70300: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204651.70361: stderr chunk (state=3): >>><<< 41175 1727204651.70364: stdout chunk (state=3): >>><<< 41175 1727204651.70382: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204651.70385: _low_level_execute_command(): starting 41175 1727204651.70392: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204651.469861-42028-46630234604488/AnsiballZ_ping.py && sleep 0' 41175 1727204651.71015: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204651.71058: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204651.71069: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204651.71151: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 41175 1727204651.88586: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 41175 1727204651.90143: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 41175 1727204651.90147: stdout chunk (state=3): >>><<< 41175 1727204651.90150: stderr chunk (state=3): >>><<< 41175 1727204651.90184: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
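The exchange above ends with the module's entire result on stdout: `{"ping": "pong", "invocation": {"module_args": {"data": "pong"}}}`. That is the contract between Ansible and its modules: the transferred AnsiballZ payload is a self-contained Python script executed over SSH, and it reports back as a single JSON object printed to stdout. A minimal sketch of a ping-like module body (a hypothetical standalone illustration, not the real `ansible.builtin.ping` source):

```python
import json

def run_ping(data="pong"):
    # Mirror the result shape observed in the log above:
    # {"ping": ..., "invocation": {"module_args": {...}}}
    return {"ping": data, "invocation": {"module_args": {"data": data}}}

if __name__ == "__main__":
    # A module communicates with the controller by printing one JSON
    # object to stdout; the controller parses it from the stdout chunk.
    print(json.dumps(run_ping()))
```

The controller side is visible in the log as `_low_level_execute_command() done: rc=0, stdout={"ping": "pong", ...}`, after which the result is dumped to JSON and sent back through the worker's result queue.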
41175 1727204651.90252: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204651.469861-42028-46630234604488/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41175 1727204651.90256: _low_level_execute_command(): starting 41175 1727204651.90259: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204651.469861-42028-46630234604488/ > /dev/null 2>&1 && sleep 0' 41175 1727204651.91115: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 1727204651.91152: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204651.91156: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204651.91159: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204651.91169: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 41175 1727204651.91286: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204651.91318: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204651.91362: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204651.93394: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204651.93397: stderr chunk (state=3): >>><<< 41175 1727204651.93400: stdout chunk (state=3): >>><<< 41175 1727204651.93420: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204651.95239: handler run complete 41175 
1727204651.95243: attempt loop complete, returning result 41175 1727204651.95245: _execute() done 41175 1727204651.95247: dumping result to json 41175 1727204651.95249: done dumping result, returning 41175 1727204651.95251: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [12b410aa-8751-f070-39c4-00000000002b] 41175 1727204651.95253: sending task result for task 12b410aa-8751-f070-39c4-00000000002b 41175 1727204651.95324: done sending task result for task 12b410aa-8751-f070-39c4-00000000002b 41175 1727204651.95327: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "ping": "pong" } 41175 1727204651.95388: no more pending results, returning what we have 41175 1727204651.95395: results queue empty 41175 1727204651.95396: checking for any_errors_fatal 41175 1727204651.95404: done checking for any_errors_fatal 41175 1727204651.95405: checking for max_fail_percentage 41175 1727204651.95407: done checking for max_fail_percentage 41175 1727204651.95408: checking to see if all hosts have failed and the running result is not ok 41175 1727204651.95409: done checking to see if all hosts have failed 41175 1727204651.95410: getting the remaining hosts for this loop 41175 1727204651.95411: done getting the remaining hosts for this loop 41175 1727204651.95415: getting the next task for host managed-node3 41175 1727204651.95427: done getting next task for host managed-node3 41175 1727204651.95429: ^ task is: TASK: meta (role_complete) 41175 1727204651.95432: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41175 1727204651.95445: getting variables 41175 1727204651.95446: in VariableManager get_vars() 41175 1727204651.95494: Calling all_inventory to load vars for managed-node3 41175 1727204651.95498: Calling groups_inventory to load vars for managed-node3 41175 1727204651.95501: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204651.95513: Calling all_plugins_play to load vars for managed-node3 41175 1727204651.95519: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204651.95524: Calling groups_plugins_play to load vars for managed-node3 41175 1727204651.97774: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204652.05857: done with get_vars() 41175 1727204652.05896: done getting variables 41175 1727204652.05980: done queuing things up, now waiting for results queue to drain 41175 1727204652.05983: results queue empty 41175 1727204652.05984: checking for any_errors_fatal 41175 1727204652.05988: done checking for any_errors_fatal 41175 1727204652.05991: checking for max_fail_percentage 41175 1727204652.05992: done checking for max_fail_percentage 41175 1727204652.05993: checking to see if all hosts have failed and the running result is not ok 41175 1727204652.05994: done checking to see if all hosts have failed 41175 1727204652.05995: getting the remaining hosts for this loop 41175 1727204652.05996: done getting the remaining hosts for this loop 41175 1727204652.06000: getting the next task for host managed-node3 41175 1727204652.06005: done getting next task for host managed-node3 41175 1727204652.06008: ^ task is: TASK: Get the routes from the route table 30200 41175 1727204652.06010: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41175 1727204652.06013: getting variables 41175 1727204652.06014: in VariableManager get_vars() 41175 1727204652.06036: Calling all_inventory to load vars for managed-node3 41175 1727204652.06039: Calling groups_inventory to load vars for managed-node3 41175 1727204652.06042: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204652.06048: Calling all_plugins_play to load vars for managed-node3 41175 1727204652.06051: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204652.06054: Calling groups_plugins_play to load vars for managed-node3 41175 1727204652.08028: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204652.11073: done with get_vars() 41175 1727204652.11115: done getting variables 41175 1727204652.11176: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get the routes from the route table 30200] ******************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:56 Tuesday 24 September 2024 15:04:12 -0400 (0:00:00.690) 0:00:19.250 ***** 41175 1727204652.11206: entering _queue_task() for managed-node3/command 41175 1727204652.11583: worker is 1 (out of 1 available) 41175 1727204652.11601: exiting _queue_task() for managed-node3/command 41175 1727204652.11620: done queuing things up, now waiting for results queue to drain 41175 1727204652.11622: waiting for pending results... 
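The task queued here will repeat the same remote lifecycle visible throughout this log: create a private tmpdir (`umask 77 && mkdir -p ...`), sftp-put the `AnsiballZ_*.py` payload, `chmod u+x` it, run it with the remote `python3`, then `rm -f -r` the tmpdir. Condensed into a local sketch (paths and tmpdir names are illustrative, not the log's real ones):

```python
import json
import os
import shutil
import subprocess
import tempfile

# 1. Private tmpdir, analogous to '( umask 77 && mkdir -p ... )' in the log.
tmp = tempfile.mkdtemp(prefix="ansible-tmp-")
module = os.path.join(tmp, "AnsiballZ_ping.py")

# 2. Stand-in for the sftp "put" of the AnsiballZ payload.
with open(module, "w") as f:
    f.write('import json; print(json.dumps({"ping": "pong"}))\n')

# 3. Analogous to 'chmod u+x <tmpdir> <tmpdir>/AnsiballZ_ping.py'.
os.chmod(module, 0o700)

# 4. Execute; the module's result arrives as JSON on stdout.
out = subprocess.run(["python3", module], capture_output=True, text=True).stdout

# 5. Cleanup, analogous to 'rm -f -r <tmpdir> > /dev/null 2>&1'.
shutil.rmtree(tmp)

print(json.loads(out)["ping"])  # → pong
```

Each `_low_level_execute_command()` record in the log corresponds to one of these steps, wrapped in `/bin/sh -c '... && sleep 0'` and carried over the multiplexed SSH master session (`mux_client_request_session`).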
41175 1727204652.12015: running TaskExecutor() for managed-node3/TASK: Get the routes from the route table 30200 41175 1727204652.12109: in run() - task 12b410aa-8751-f070-39c4-00000000005b 41175 1727204652.12114: variable 'ansible_search_path' from source: unknown 41175 1727204652.12118: calling self._execute() 41175 1727204652.12220: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204652.12236: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204652.12256: variable 'omit' from source: magic vars 41175 1727204652.12776: variable 'ansible_distribution_major_version' from source: facts 41175 1727204652.12800: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204652.12814: variable 'omit' from source: magic vars 41175 1727204652.12843: variable 'omit' from source: magic vars 41175 1727204652.12900: variable 'omit' from source: magic vars 41175 1727204652.12981: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41175 1727204652.13013: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41175 1727204652.13068: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41175 1727204652.13207: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204652.13210: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204652.13214: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41175 1727204652.13216: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204652.13219: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204652.13351: Set connection var ansible_shell_executable to /bin/sh 
41175 1727204652.13361: Set connection var ansible_shell_type to sh 41175 1727204652.13373: Set connection var ansible_pipelining to False 41175 1727204652.13388: Set connection var ansible_timeout to 10 41175 1727204652.13429: Set connection var ansible_connection to ssh 41175 1727204652.13443: Set connection var ansible_module_compression to ZIP_DEFLATED 41175 1727204652.13475: variable 'ansible_shell_executable' from source: unknown 41175 1727204652.13484: variable 'ansible_connection' from source: unknown 41175 1727204652.13495: variable 'ansible_module_compression' from source: unknown 41175 1727204652.13525: variable 'ansible_shell_type' from source: unknown 41175 1727204652.13528: variable 'ansible_shell_executable' from source: unknown 41175 1727204652.13530: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204652.13532: variable 'ansible_pipelining' from source: unknown 41175 1727204652.13538: variable 'ansible_timeout' from source: unknown 41175 1727204652.13595: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204652.13738: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41175 1727204652.13795: variable 'omit' from source: magic vars 41175 1727204652.13799: starting attempt loop 41175 1727204652.13802: running the handler 41175 1727204652.13804: _low_level_execute_command(): starting 41175 1727204652.13816: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41175 1727204652.14869: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204652.14924: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204652.14994: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204652.16743: stdout chunk (state=3): >>>/root <<< 41175 1727204652.16961: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204652.16964: stdout chunk (state=3): >>><<< 41175 1727204652.16966: stderr chunk (state=3): >>><<< 41175 1727204652.16988: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204652.17011: _low_level_execute_command(): starting 41175 1727204652.17027: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204652.1699796-42051-231674870980644 `" && echo ansible-tmp-1727204652.1699796-42051-231674870980644="` echo /root/.ansible/tmp/ansible-tmp-1727204652.1699796-42051-231674870980644 `" ) && sleep 0' 41175 1727204652.17697: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 1727204652.17714: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204652.17733: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204652.17758: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204652.17775: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 41175 1727204652.17873: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204652.17913: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204652.17936: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204652.17952: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204652.18027: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204652.20087: stdout chunk (state=3): >>>ansible-tmp-1727204652.1699796-42051-231674870980644=/root/.ansible/tmp/ansible-tmp-1727204652.1699796-42051-231674870980644 <<< 41175 1727204652.20291: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204652.20497: stdout chunk (state=3): >>><<< 41175 1727204652.20501: stderr chunk (state=3): >>><<< 41175 1727204652.20504: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204652.1699796-42051-231674870980644=/root/.ansible/tmp/ansible-tmp-1727204652.1699796-42051-231674870980644 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204652.20507: variable 'ansible_module_compression' from source: unknown 41175 1727204652.20509: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-41175qa2aqh90/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 41175 1727204652.20512: variable 'ansible_facts' from source: unknown 41175 1727204652.20757: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204652.1699796-42051-231674870980644/AnsiballZ_command.py 41175 1727204652.21020: Sending initial data 41175 1727204652.21032: Sent initial data (156 bytes) 41175 1727204652.21573: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 1727204652.21591: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204652.21609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204652.21728: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204652.21756: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204652.21830: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204652.23492: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41175 1727204652.23540: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41175 1727204652.23604: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41175qa2aqh90/tmpn_tgatux /root/.ansible/tmp/ansible-tmp-1727204652.1699796-42051-231674870980644/AnsiballZ_command.py <<< 41175 1727204652.23645: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204652.1699796-42051-231674870980644/AnsiballZ_command.py" debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-41175qa2aqh90/tmpn_tgatux" to remote "/root/.ansible/tmp/ansible-tmp-1727204652.1699796-42051-231674870980644/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204652.1699796-42051-231674870980644/AnsiballZ_command.py" <<< 41175 1727204652.24686: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204652.24810: stderr chunk (state=3): >>><<< 41175 1727204652.24822: stdout chunk (state=3): >>><<< 41175 1727204652.24857: done transferring module to remote 41175 1727204652.24884: _low_level_execute_command(): starting 41175 1727204652.24899: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204652.1699796-42051-231674870980644/ /root/.ansible/tmp/ansible-tmp-1727204652.1699796-42051-231674870980644/AnsiballZ_command.py && sleep 0' 41175 1727204652.25568: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 1727204652.25583: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204652.25603: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204652.25624: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204652.25651: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 41175 1727204652.25762: stderr chunk (state=3): >>>debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204652.25787: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204652.25865: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204652.27782: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204652.27801: stdout chunk (state=3): >>><<< 41175 1727204652.27813: stderr chunk (state=3): >>><<< 41175 1727204652.27838: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204652.27931: _low_level_execute_command(): starting 41175 1727204652.27935: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204652.1699796-42051-231674870980644/AnsiballZ_command.py && sleep 0' 41175 1727204652.28491: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 1727204652.28507: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204652.28529: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204652.28547: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204652.28563: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 41175 1727204652.28604: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204652.28681: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204652.28701: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK 
<<< 41175 1727204652.28723: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204652.28804: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204652.46651: stdout chunk (state=3): >>> {"changed": true, "stdout": "192.0.2.64/26 via 198.51.100.8 dev ethtest0 proto static src 198.51.100.3 metric 50 \n198.51.100.64/26 via 198.51.100.6 dev ethtest0 proto static metric 4 ", "stderr": "", "rc": 0, "cmd": ["ip", "route", "show", "table", "30200"], "start": "2024-09-24 15:04:12.461189", "end": "2024-09-24 15:04:12.465301", "delta": "0:00:00.004112", "msg": "", "invocation": {"module_args": {"_raw_params": "ip route show table 30200", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 41175 1727204652.48381: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
<<< 41175 1727204652.48592: stderr chunk (state=3): >>><<< 41175 1727204652.48597: stdout chunk (state=3): >>><<< 41175 1727204652.48600: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "192.0.2.64/26 via 198.51.100.8 dev ethtest0 proto static src 198.51.100.3 metric 50 \n198.51.100.64/26 via 198.51.100.6 dev ethtest0 proto static metric 4 ", "stderr": "", "rc": 0, "cmd": ["ip", "route", "show", "table", "30200"], "start": "2024-09-24 15:04:12.461189", "end": "2024-09-24 15:04:12.465301", "delta": "0:00:00.004112", "msg": "", "invocation": {"module_args": {"_raw_params": "ip route show table 30200", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared 
connection to 10.31.10.90 closed. 41175 1727204652.48603: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip route show table 30200', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204652.1699796-42051-231674870980644/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41175 1727204652.48606: _low_level_execute_command(): starting 41175 1727204652.48609: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204652.1699796-42051-231674870980644/ > /dev/null 2>&1 && sleep 0' 41175 1727204652.49180: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 1727204652.49246: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204652.49249: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204652.49252: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204652.49267: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 41175 1727204652.49275: stderr chunk (state=3): >>>debug2: match not found <<< 41175 1727204652.49285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204652.49360: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204652.49424: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204652.49454: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204652.49483: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204652.49518: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204652.51447: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204652.51506: stderr chunk (state=3): >>><<< 41175 1727204652.51510: stdout chunk (state=3): >>><<< 41175 1727204652.51525: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204652.51533: handler run complete 41175 1727204652.51556: Evaluated conditional (False): False 41175 1727204652.51568: attempt loop complete, returning result 41175 1727204652.51571: _execute() done 41175 1727204652.51573: dumping result to json 41175 1727204652.51580: done dumping result, returning 41175 1727204652.51588: done running TaskExecutor() for managed-node3/TASK: Get the routes from the route table 30200 [12b410aa-8751-f070-39c4-00000000005b] 41175 1727204652.51599: sending task result for task 12b410aa-8751-f070-39c4-00000000005b 41175 1727204652.51716: done sending task result for task 12b410aa-8751-f070-39c4-00000000005b 41175 1727204652.51720: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": [ "ip", "route", "show", "table", "30200" ], "delta": "0:00:00.004112", "end": "2024-09-24 15:04:12.465301", "rc": 0, "start": "2024-09-24 15:04:12.461189" } STDOUT: 192.0.2.64/26 via 198.51.100.8 dev ethtest0 proto static src 198.51.100.3 metric 50 198.51.100.64/26 via 198.51.100.6 dev ethtest0 proto static metric 4 41175 1727204652.51817: no more pending results, returning what we have 41175 1727204652.51822: results queue empty 41175 1727204652.51823: checking for any_errors_fatal 41175 1727204652.51826: done checking for any_errors_fatal 41175 1727204652.51827: checking for max_fail_percentage 41175 1727204652.51829: done checking for max_fail_percentage 41175 1727204652.51834: checking to see if all hosts have failed and the running result is not ok 41175 1727204652.51836: done checking to see if all hosts have failed 41175 1727204652.51837: getting the remaining hosts for this loop 41175 1727204652.51838: done getting the remaining hosts for this loop 
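(Editor's aside, not part of the captured log.) The task result above shows `ip route show table 30200` returning two static routes. A minimal Python sketch of how such `ip route show` stdout can be parsed into structured fields for assertions; `parse_routes` is a hypothetical helper, not part of the playbook under test, and it relies on `ip route` printing each route as a destination followed by keyword/value pairs:

```python
def parse_routes(stdout):
    """Parse `ip route show` output into a list of dicts.

    Each line is a destination prefix followed by alternating
    keyword/value pairs (via, dev, proto, src, metric, ...).
    """
    routes = []
    for line in stdout.splitlines():
        tokens = line.split()
        if not tokens:
            continue
        route = {"dst": tokens[0]}
        # Remaining tokens arrive as keyword/value pairs.
        for key, value in zip(tokens[1::2], tokens[2::2]):
            route[key] = value
        routes.append(route)
    return routes


# The exact stdout captured in the module result above:
stdout = (
    "192.0.2.64/26 via 198.51.100.8 dev ethtest0 proto static src 198.51.100.3 metric 50 \n"
    "198.51.100.64/26 via 198.51.100.6 dev ethtest0 proto static metric 4 "
)
routes = parse_routes(stdout)
```

A check like the playbook's subsequent assertions could then compare `routes[0]["metric"]` against the expected value rather than substring-matching raw stdout.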
41175 1727204652.51843: getting the next task for host managed-node3 41175 1727204652.51850: done getting next task for host managed-node3 41175 1727204652.51854: ^ task is: TASK: Get the routes from the route table 30400 41175 1727204652.51856: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41175 1727204652.51860: getting variables 41175 1727204652.51862: in VariableManager get_vars() 41175 1727204652.51910: Calling all_inventory to load vars for managed-node3 41175 1727204652.51914: Calling groups_inventory to load vars for managed-node3 41175 1727204652.51917: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204652.51929: Calling all_plugins_play to load vars for managed-node3 41175 1727204652.51931: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204652.51935: Calling groups_plugins_play to load vars for managed-node3 41175 1727204652.53337: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204652.54926: done with get_vars() 41175 1727204652.54953: done getting variables 41175 1727204652.55006: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get the routes from the route table 30400] ******************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:62 Tuesday 24 September 2024 15:04:12 -0400 (0:00:00.438) 0:00:19.689 ***** 41175 1727204652.55032: entering 
_queue_task() for managed-node3/command 41175 1727204652.55302: worker is 1 (out of 1 available) 41175 1727204652.55316: exiting _queue_task() for managed-node3/command 41175 1727204652.55331: done queuing things up, now waiting for results queue to drain 41175 1727204652.55333: waiting for pending results... 41175 1727204652.55558: running TaskExecutor() for managed-node3/TASK: Get the routes from the route table 30400 41175 1727204652.55643: in run() - task 12b410aa-8751-f070-39c4-00000000005c 41175 1727204652.55656: variable 'ansible_search_path' from source: unknown 41175 1727204652.55692: calling self._execute() 41175 1727204652.55779: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204652.55784: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204652.55798: variable 'omit' from source: magic vars 41175 1727204652.56139: variable 'ansible_distribution_major_version' from source: facts 41175 1727204652.56152: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204652.56159: variable 'omit' from source: magic vars 41175 1727204652.56181: variable 'omit' from source: magic vars 41175 1727204652.56213: variable 'omit' from source: magic vars 41175 1727204652.56253: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41175 1727204652.56284: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41175 1727204652.56311: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41175 1727204652.56332: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204652.56342: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204652.56370: variable 'inventory_hostname' from 
source: host vars for 'managed-node3' 41175 1727204652.56374: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204652.56378: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204652.56469: Set connection var ansible_shell_executable to /bin/sh 41175 1727204652.56472: Set connection var ansible_shell_type to sh 41175 1727204652.56479: Set connection var ansible_pipelining to False 41175 1727204652.56488: Set connection var ansible_timeout to 10 41175 1727204652.56497: Set connection var ansible_connection to ssh 41175 1727204652.56508: Set connection var ansible_module_compression to ZIP_DEFLATED 41175 1727204652.56529: variable 'ansible_shell_executable' from source: unknown 41175 1727204652.56532: variable 'ansible_connection' from source: unknown 41175 1727204652.56535: variable 'ansible_module_compression' from source: unknown 41175 1727204652.56538: variable 'ansible_shell_type' from source: unknown 41175 1727204652.56541: variable 'ansible_shell_executable' from source: unknown 41175 1727204652.56545: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204652.56596: variable 'ansible_pipelining' from source: unknown 41175 1727204652.56600: variable 'ansible_timeout' from source: unknown 41175 1727204652.56602: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204652.56684: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41175 1727204652.56697: variable 'omit' from source: magic vars 41175 1727204652.56703: starting attempt loop 41175 1727204652.56707: running the handler 41175 1727204652.56724: _low_level_execute_command(): starting 41175 1727204652.56732: 
_low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41175 1727204652.57266: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204652.57303: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204652.57308: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204652.57311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204652.57367: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204652.57371: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204652.57377: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204652.57425: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204652.59139: stdout chunk (state=3): >>>/root <<< 41175 1727204652.59245: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204652.59309: stderr chunk (state=3): >>><<< 41175 1727204652.59313: stdout chunk (state=3): >>><<< 41175 1727204652.59341: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204652.59356: _low_level_execute_command(): starting 41175 1727204652.59365: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204652.5934186-42069-109679783917693 `" && echo ansible-tmp-1727204652.5934186-42069-109679783917693="` echo /root/.ansible/tmp/ansible-tmp-1727204652.5934186-42069-109679783917693 `" ) && sleep 0' 41175 1727204652.59849: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204652.59853: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204652.59856: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204652.59865: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204652.59922: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204652.59927: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204652.59963: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204652.61967: stdout chunk (state=3): >>>ansible-tmp-1727204652.5934186-42069-109679783917693=/root/.ansible/tmp/ansible-tmp-1727204652.5934186-42069-109679783917693 <<< 41175 1727204652.62085: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204652.62144: stderr chunk (state=3): >>><<< 41175 1727204652.62147: stdout chunk (state=3): >>><<< 41175 1727204652.62168: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204652.5934186-42069-109679783917693=/root/.ansible/tmp/ansible-tmp-1727204652.5934186-42069-109679783917693 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204652.62201: variable 'ansible_module_compression' from source: unknown 41175 1727204652.62245: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-41175qa2aqh90/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 41175 1727204652.62287: variable 'ansible_facts' from source: unknown 41175 1727204652.62341: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204652.5934186-42069-109679783917693/AnsiballZ_command.py 41175 1727204652.62461: Sending initial data 41175 1727204652.62465: Sent initial data (156 bytes) 41175 1727204652.62943: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204652.62946: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 41175 1727204652.62949: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204652.62952: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204652.62997: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204652.63013: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204652.63053: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204652.64706: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41175 1727204652.64741: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41175 1727204652.64776: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41175qa2aqh90/tmpowrbvfbf /root/.ansible/tmp/ansible-tmp-1727204652.5934186-42069-109679783917693/AnsiballZ_command.py <<< 41175 1727204652.64780: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204652.5934186-42069-109679783917693/AnsiballZ_command.py" <<< 41175 1727204652.64811: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-41175qa2aqh90/tmpowrbvfbf" to remote "/root/.ansible/tmp/ansible-tmp-1727204652.5934186-42069-109679783917693/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204652.5934186-42069-109679783917693/AnsiballZ_command.py" <<< 41175 1727204652.65572: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204652.65640: stderr chunk (state=3): >>><<< 41175 1727204652.65644: stdout chunk (state=3): >>><<< 41175 1727204652.65666: done transferring module to remote 41175 1727204652.65678: _low_level_execute_command(): starting 41175 1727204652.65683: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204652.5934186-42069-109679783917693/ /root/.ansible/tmp/ansible-tmp-1727204652.5934186-42069-109679783917693/AnsiballZ_command.py && sleep 0' 41175 1727204652.66155: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204652.66159: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204652.66162: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204652.66164: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204652.66229: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204652.66240: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204652.66266: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204652.68137: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204652.68194: stderr chunk (state=3): >>><<< 41175 1727204652.68197: stdout chunk (state=3): >>><<< 41175 1727204652.68214: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204652.68217: _low_level_execute_command(): starting 41175 1727204652.68227: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204652.5934186-42069-109679783917693/AnsiballZ_command.py && sleep 0' 41175 1727204652.68685: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204652.68688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 41175 1727204652.68693: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 41175 1727204652.68696: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 41175 1727204652.68698: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204652.68751: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204652.68759: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 <<< 41175 1727204652.68816: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204652.86459: stdout chunk (state=3): >>> {"changed": true, "stdout": "198.51.100.128/26 via 198.51.100.1 dev ethtest0 proto static metric 2 ", "stderr": "", "rc": 0, "cmd": ["ip", "route", "show", "table", "30400"], "start": "2024-09-24 15:04:12.859417", "end": "2024-09-24 15:04:12.863243", "delta": "0:00:00.003826", "msg": "", "invocation": {"module_args": {"_raw_params": "ip route show table 30400", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 41175 1727204652.88410: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 41175 1727204652.88440: stderr chunk (state=3): >>><<< 41175 1727204652.88453: stdout chunk (state=3): >>><<< 41175 1727204652.88483: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "198.51.100.128/26 via 198.51.100.1 dev ethtest0 proto static metric 2 ", "stderr": "", "rc": 0, "cmd": ["ip", "route", "show", "table", "30400"], "start": "2024-09-24 15:04:12.859417", "end": "2024-09-24 15:04:12.863243", "delta": "0:00:00.003826", "msg": "", "invocation": {"module_args": {"_raw_params": "ip route show table 30400", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 41175 1727204652.88758: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip route show table 30400', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204652.5934186-42069-109679783917693/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41175 1727204652.88763: _low_level_execute_command(): starting 41175 1727204652.88766: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204652.5934186-42069-109679783917693/ > /dev/null 2>&1 && sleep 0' 41175 1727204652.89654: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 
1727204652.89705: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204652.89724: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 41175 1727204652.89738: stderr chunk (state=3): >>>debug2: match found <<< 41175 1727204652.89844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 41175 1727204652.89883: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204652.89905: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204652.89988: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204652.91931: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204652.92002: stderr chunk (state=3): >>><<< 41175 1727204652.92005: stdout chunk (state=3): >>><<< 41175 1727204652.92024: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 
originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204652.92032: handler run complete 41175 1727204652.92199: Evaluated conditional (False): False 41175 1727204652.92203: attempt loop complete, returning result 41175 1727204652.92205: _execute() done 41175 1727204652.92208: dumping result to json 41175 1727204652.92210: done dumping result, returning 41175 1727204652.92212: done running TaskExecutor() for managed-node3/TASK: Get the routes from the route table 30400 [12b410aa-8751-f070-39c4-00000000005c] 41175 1727204652.92214: sending task result for task 12b410aa-8751-f070-39c4-00000000005c 41175 1727204652.92295: done sending task result for task 12b410aa-8751-f070-39c4-00000000005c 41175 1727204652.92299: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": [ "ip", "route", "show", "table", "30400" ], "delta": "0:00:00.003826", "end": "2024-09-24 15:04:12.863243", "rc": 0, "start": "2024-09-24 15:04:12.859417" } STDOUT: 198.51.100.128/26 via 198.51.100.1 dev ethtest0 proto static metric 2 41175 1727204652.92497: no more pending results, returning what we have 41175 1727204652.92501: results queue empty 41175 1727204652.92502: checking for 
any_errors_fatal 41175 1727204652.92513: done checking for any_errors_fatal 41175 1727204652.92515: checking for max_fail_percentage 41175 1727204652.92519: done checking for max_fail_percentage 41175 1727204652.92520: checking to see if all hosts have failed and the running result is not ok 41175 1727204652.92521: done checking to see if all hosts have failed 41175 1727204652.92522: getting the remaining hosts for this loop 41175 1727204652.92523: done getting the remaining hosts for this loop 41175 1727204652.92528: getting the next task for host managed-node3 41175 1727204652.92534: done getting next task for host managed-node3 41175 1727204652.92536: ^ task is: TASK: Assert that the route table 30200 contains the specified route 41175 1727204652.92538: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204652.92542: getting variables 41175 1727204652.92543: in VariableManager get_vars() 41175 1727204652.92583: Calling all_inventory to load vars for managed-node3 41175 1727204652.92586: Calling groups_inventory to load vars for managed-node3 41175 1727204652.92647: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204652.92661: Calling all_plugins_play to load vars for managed-node3 41175 1727204652.92665: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204652.92669: Calling groups_plugins_play to load vars for managed-node3 41175 1727204652.94439: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204652.96609: done with get_vars() 41175 1727204652.96633: done getting variables 41175 1727204652.96713: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert that the route table 30200 contains the specified route] ********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:68 Tuesday 24 September 2024 15:04:12 -0400 (0:00:00.417) 0:00:20.106 ***** 41175 1727204652.96750: entering _queue_task() for managed-node3/assert 41175 1727204652.97101: worker is 1 (out of 1 available) 41175 1727204652.97116: exiting _queue_task() for managed-node3/assert 41175 1727204652.97130: done queuing things up, now waiting for results queue to drain 41175 1727204652.97132: waiting for pending results... 
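(Annotation, not part of the log.) The assert tasks traced below evaluate conditionals of the form `route_table_30400.stdout is search("…")`. Ansible's Jinja2 `search` test is, in effect, a regex search over the registered command output; a minimal sketch of that evaluation, using the route string actually returned by `ip route show table 30400` in the log above:

```python
import re

# Sketch of how the `search` test in the assert conditional behaves:
# it applies re.search() to the registered stdout and is truthy on a match.
# `stdout` below is copied from the task result in the log; the pattern is
# the literal route string the playbook asserts on.
stdout = "198.51.100.128/26 via 198.51.100.1 dev ethtest0 proto static metric 2"
pattern = "198.51.100.128/26 via 198.51.100.1 dev ethtest0 proto static metric 2"

matched = re.search(pattern, stdout) is not None
# Note: the pattern is treated as a regex, so `.` and `/` are metacharacter
# and literal respectively; here the dots still match the literal dots.
```

This is why the conditional evaluates to `True` in the log: the registered stdout contains the expected route line verbatim.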
41175 1727204652.97615: running TaskExecutor() for managed-node3/TASK: Assert that the route table 30200 contains the specified route 41175 1727204652.97621: in run() - task 12b410aa-8751-f070-39c4-00000000005d 41175 1727204652.97625: variable 'ansible_search_path' from source: unknown 41175 1727204652.97637: calling self._execute() 41175 1727204652.97757: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204652.97772: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204652.97837: variable 'omit' from source: magic vars 41175 1727204652.98176: variable 'ansible_distribution_major_version' from source: facts 41175 1727204652.98190: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204652.98199: variable 'omit' from source: magic vars 41175 1727204652.98219: variable 'omit' from source: magic vars 41175 1727204652.98249: variable 'omit' from source: magic vars 41175 1727204652.98285: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41175 1727204652.98323: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41175 1727204652.98340: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41175 1727204652.98358: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204652.98369: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204652.98398: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41175 1727204652.98403: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204652.98406: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204652.98494: Set connection var 
ansible_shell_executable to /bin/sh 41175 1727204652.98498: Set connection var ansible_shell_type to sh 41175 1727204652.98503: Set connection var ansible_pipelining to False 41175 1727204652.98512: Set connection var ansible_timeout to 10 41175 1727204652.98527: Set connection var ansible_connection to ssh 41175 1727204652.98530: Set connection var ansible_module_compression to ZIP_DEFLATED 41175 1727204652.98550: variable 'ansible_shell_executable' from source: unknown 41175 1727204652.98553: variable 'ansible_connection' from source: unknown 41175 1727204652.98556: variable 'ansible_module_compression' from source: unknown 41175 1727204652.98561: variable 'ansible_shell_type' from source: unknown 41175 1727204652.98565: variable 'ansible_shell_executable' from source: unknown 41175 1727204652.98569: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204652.98574: variable 'ansible_pipelining' from source: unknown 41175 1727204652.98577: variable 'ansible_timeout' from source: unknown 41175 1727204652.98583: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204652.98704: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41175 1727204652.98715: variable 'omit' from source: magic vars 41175 1727204652.98722: starting attempt loop 41175 1727204652.98724: running the handler 41175 1727204652.98873: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41175 1727204652.99072: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41175 1727204652.99110: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41175 
1727204652.99173: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41175 1727204652.99206: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41175 1727204652.99276: variable 'route_table_30200' from source: set_fact 41175 1727204652.99309: Evaluated conditional (route_table_30200.stdout is search("198.51.100.64/26 via 198.51.100.6 dev ethtest0 proto static metric 4")): True 41175 1727204652.99427: variable 'route_table_30200' from source: set_fact 41175 1727204652.99448: Evaluated conditional (route_table_30200.stdout is search("192.0.2.64/26 via 198.51.100.8 dev ethtest0 proto static src 198.51.100.3 metric 50")): True 41175 1727204652.99455: handler run complete 41175 1727204652.99469: attempt loop complete, returning result 41175 1727204652.99472: _execute() done 41175 1727204652.99475: dumping result to json 41175 1727204652.99480: done dumping result, returning 41175 1727204652.99487: done running TaskExecutor() for managed-node3/TASK: Assert that the route table 30200 contains the specified route [12b410aa-8751-f070-39c4-00000000005d] 41175 1727204652.99495: sending task result for task 12b410aa-8751-f070-39c4-00000000005d 41175 1727204652.99591: done sending task result for task 12b410aa-8751-f070-39c4-00000000005d 41175 1727204652.99594: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false } MSG: All assertions passed 41175 1727204652.99665: no more pending results, returning what we have 41175 1727204652.99670: results queue empty 41175 1727204652.99671: checking for any_errors_fatal 41175 1727204652.99692: done checking for any_errors_fatal 41175 1727204652.99693: checking for max_fail_percentage 41175 1727204652.99695: done checking for max_fail_percentage 41175 1727204652.99696: checking to see if all hosts have failed and the running result is not ok 41175 1727204652.99697: done checking to see if all hosts have failed 41175 
1727204652.99698: getting the remaining hosts for this loop 41175 1727204652.99699: done getting the remaining hosts for this loop 41175 1727204652.99704: getting the next task for host managed-node3 41175 1727204652.99711: done getting next task for host managed-node3 41175 1727204652.99714: ^ task is: TASK: Assert that the route table 30400 contains the specified route 41175 1727204652.99720: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41175 1727204652.99723: getting variables 41175 1727204652.99726: in VariableManager get_vars() 41175 1727204652.99765: Calling all_inventory to load vars for managed-node3 41175 1727204652.99768: Calling groups_inventory to load vars for managed-node3 41175 1727204652.99771: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204652.99879: Calling all_plugins_play to load vars for managed-node3 41175 1727204652.99884: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204652.99892: Calling groups_plugins_play to load vars for managed-node3 41175 1727204653.01579: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204653.03182: done with get_vars() 41175 1727204653.03211: done getting variables 41175 1727204653.03265: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert that the route table 30400 contains the specified route] ********** task path: 
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:76 Tuesday 24 September 2024 15:04:13 -0400 (0:00:00.065) 0:00:20.171 ***** 41175 1727204653.03298: entering _queue_task() for managed-node3/assert 41175 1727204653.03570: worker is 1 (out of 1 available) 41175 1727204653.03584: exiting _queue_task() for managed-node3/assert 41175 1727204653.03600: done queuing things up, now waiting for results queue to drain 41175 1727204653.03602: waiting for pending results... 41175 1727204653.03805: running TaskExecutor() for managed-node3/TASK: Assert that the route table 30400 contains the specified route 41175 1727204653.03884: in run() - task 12b410aa-8751-f070-39c4-00000000005e 41175 1727204653.03898: variable 'ansible_search_path' from source: unknown 41175 1727204653.03936: calling self._execute() 41175 1727204653.04027: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204653.04035: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204653.04045: variable 'omit' from source: magic vars 41175 1727204653.04379: variable 'ansible_distribution_major_version' from source: facts 41175 1727204653.04394: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204653.04401: variable 'omit' from source: magic vars 41175 1727204653.04422: variable 'omit' from source: magic vars 41175 1727204653.04457: variable 'omit' from source: magic vars 41175 1727204653.04497: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41175 1727204653.04532: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41175 1727204653.04551: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41175 1727204653.04567: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 41175 1727204653.04578: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204653.04609: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41175 1727204653.04614: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204653.04617: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204653.04705: Set connection var ansible_shell_executable to /bin/sh 41175 1727204653.04711: Set connection var ansible_shell_type to sh 41175 1727204653.04714: Set connection var ansible_pipelining to False 41175 1727204653.04729: Set connection var ansible_timeout to 10 41175 1727204653.04738: Set connection var ansible_connection to ssh 41175 1727204653.04741: Set connection var ansible_module_compression to ZIP_DEFLATED 41175 1727204653.04762: variable 'ansible_shell_executable' from source: unknown 41175 1727204653.04766: variable 'ansible_connection' from source: unknown 41175 1727204653.04769: variable 'ansible_module_compression' from source: unknown 41175 1727204653.04771: variable 'ansible_shell_type' from source: unknown 41175 1727204653.04776: variable 'ansible_shell_executable' from source: unknown 41175 1727204653.04780: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204653.04785: variable 'ansible_pipelining' from source: unknown 41175 1727204653.04791: variable 'ansible_timeout' from source: unknown 41175 1727204653.04796: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204653.04920: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41175 
1727204653.04939: variable 'omit' from source: magic vars 41175 1727204653.04943: starting attempt loop 41175 1727204653.04946: running the handler 41175 1727204653.05096: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41175 1727204653.05304: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41175 1727204653.05343: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41175 1727204653.05411: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41175 1727204653.05444: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41175 1727204653.05520: variable 'route_table_30400' from source: set_fact 41175 1727204653.05547: Evaluated conditional (route_table_30400.stdout is search("198.51.100.128/26 via 198.51.100.1 dev ethtest0 proto static metric 2")): True 41175 1727204653.05554: handler run complete 41175 1727204653.05568: attempt loop complete, returning result 41175 1727204653.05571: _execute() done 41175 1727204653.05574: dumping result to json 41175 1727204653.05578: done dumping result, returning 41175 1727204653.05586: done running TaskExecutor() for managed-node3/TASK: Assert that the route table 30400 contains the specified route [12b410aa-8751-f070-39c4-00000000005e] 41175 1727204653.05594: sending task result for task 12b410aa-8751-f070-39c4-00000000005e 41175 1727204653.05686: done sending task result for task 12b410aa-8751-f070-39c4-00000000005e 41175 1727204653.05691: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false } MSG: All assertions passed 41175 1727204653.05764: no more pending results, returning what we have 41175 1727204653.05768: results queue empty 41175 1727204653.05769: checking for any_errors_fatal 41175 1727204653.05781: done checking for any_errors_fatal 41175 
1727204653.05782: checking for max_fail_percentage 41175 1727204653.05784: done checking for max_fail_percentage 41175 1727204653.05785: checking to see if all hosts have failed and the running result is not ok 41175 1727204653.05786: done checking to see if all hosts have failed 41175 1727204653.05787: getting the remaining hosts for this loop 41175 1727204653.05791: done getting the remaining hosts for this loop 41175 1727204653.05796: getting the next task for host managed-node3 41175 1727204653.05801: done getting next task for host managed-node3 41175 1727204653.05804: ^ task is: TASK: Create a dedicated test file in `/etc/iproute2/rt_tables.d/` and add a new routing table 41175 1727204653.05807: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204653.05811: getting variables 41175 1727204653.05813: in VariableManager get_vars() 41175 1727204653.05856: Calling all_inventory to load vars for managed-node3 41175 1727204653.05859: Calling groups_inventory to load vars for managed-node3 41175 1727204653.05861: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204653.05873: Calling all_plugins_play to load vars for managed-node3 41175 1727204653.05876: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204653.05879: Calling groups_plugins_play to load vars for managed-node3 41175 1727204653.07297: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204653.08867: done with get_vars() 41175 1727204653.08891: done getting variables TASK [Create a dedicated test file in `/etc/iproute2/rt_tables.d/` and add a new routing table] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:82 Tuesday 24 September 2024 15:04:13 -0400 (0:00:00.056) 0:00:20.228 ***** 41175 1727204653.08968: entering _queue_task() for managed-node3/lineinfile 41175 1727204653.08970: Creating lock for lineinfile 41175 1727204653.09239: worker is 1 (out of 1 available) 41175 1727204653.09252: exiting _queue_task() for managed-node3/lineinfile 41175 1727204653.09265: done queuing things up, now waiting for results queue to drain 41175 1727204653.09267: waiting for pending results... 
41175 1727204653.09467: running TaskExecutor() for managed-node3/TASK: Create a dedicated test file in `/etc/iproute2/rt_tables.d/` and add a new routing table 41175 1727204653.09542: in run() - task 12b410aa-8751-f070-39c4-00000000005f 41175 1727204653.09556: variable 'ansible_search_path' from source: unknown 41175 1727204653.09591: calling self._execute() 41175 1727204653.09677: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204653.09685: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204653.09696: variable 'omit' from source: magic vars 41175 1727204653.10040: variable 'ansible_distribution_major_version' from source: facts 41175 1727204653.10055: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204653.10060: variable 'omit' from source: magic vars 41175 1727204653.10079: variable 'omit' from source: magic vars 41175 1727204653.10113: variable 'omit' from source: magic vars 41175 1727204653.10154: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41175 1727204653.10187: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41175 1727204653.10207: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41175 1727204653.10226: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204653.10237: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204653.10266: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41175 1727204653.10269: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204653.10271: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204653.10361: Set 
connection var ansible_shell_executable to /bin/sh 41175 1727204653.10366: Set connection var ansible_shell_type to sh 41175 1727204653.10371: Set connection var ansible_pipelining to False 41175 1727204653.10380: Set connection var ansible_timeout to 10 41175 1727204653.10387: Set connection var ansible_connection to ssh 41175 1727204653.10401: Set connection var ansible_module_compression to ZIP_DEFLATED 41175 1727204653.10423: variable 'ansible_shell_executable' from source: unknown 41175 1727204653.10427: variable 'ansible_connection' from source: unknown 41175 1727204653.10430: variable 'ansible_module_compression' from source: unknown 41175 1727204653.10435: variable 'ansible_shell_type' from source: unknown 41175 1727204653.10438: variable 'ansible_shell_executable' from source: unknown 41175 1727204653.10442: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204653.10447: variable 'ansible_pipelining' from source: unknown 41175 1727204653.10451: variable 'ansible_timeout' from source: unknown 41175 1727204653.10457: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204653.10636: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 41175 1727204653.10646: variable 'omit' from source: magic vars 41175 1727204653.10653: starting attempt loop 41175 1727204653.10656: running the handler 41175 1727204653.10670: _low_level_execute_command(): starting 41175 1727204653.10679: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41175 1727204653.11240: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204653.11244: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204653.11248: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 41175 1727204653.11251: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204653.11298: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204653.11317: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204653.11357: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204653.13112: stdout chunk (state=3): >>>/root <<< 41175 1727204653.13222: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204653.13287: stderr chunk (state=3): >>><<< 41175 1727204653.13294: stdout chunk (state=3): >>><<< 41175 1727204653.13319: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204653.13330: _low_level_execute_command(): starting 41175 1727204653.13337: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204653.133167-42095-88317220810954 `" && echo ansible-tmp-1727204653.133167-42095-88317220810954="` echo /root/.ansible/tmp/ansible-tmp-1727204653.133167-42095-88317220810954 `" ) && sleep 0' 41175 1727204653.13839: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204653.13843: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204653.13846: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 41175 1727204653.13857: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 41175 1727204653.13859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204653.13895: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204653.13918: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204653.13959: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204653.15952: stdout chunk (state=3): >>>ansible-tmp-1727204653.133167-42095-88317220810954=/root/.ansible/tmp/ansible-tmp-1727204653.133167-42095-88317220810954 <<< 41175 1727204653.16069: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204653.16132: stderr chunk (state=3): >>><<< 41175 1727204653.16135: stdout chunk (state=3): >>><<< 41175 1727204653.16155: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204653.133167-42095-88317220810954=/root/.ansible/tmp/ansible-tmp-1727204653.133167-42095-88317220810954 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204653.16205: variable 'ansible_module_compression' from source: unknown 41175 1727204653.16256: ANSIBALLZ: Using lock for lineinfile 41175 1727204653.16260: ANSIBALLZ: Acquiring lock 41175 1727204653.16263: ANSIBALLZ: Lock acquired: 140088833531712 41175 1727204653.16265: ANSIBALLZ: Creating module 41175 1727204653.29197: ANSIBALLZ: Writing module into payload 41175 1727204653.29238: ANSIBALLZ: Writing module 41175 1727204653.29269: ANSIBALLZ: Renaming module 41175 1727204653.29281: ANSIBALLZ: Done creating module 41175 1727204653.29307: variable 'ansible_facts' from source: unknown 41175 1727204653.29394: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204653.133167-42095-88317220810954/AnsiballZ_lineinfile.py 41175 1727204653.29662: Sending initial data 41175 1727204653.29673: Sent initial data (157 bytes) 41175 1727204653.30251: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 41175 1727204653.30259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204653.30345: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204653.30380: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204653.30452: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204653.32184: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41175 1727204653.32258: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41175 1727204653.32305: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204653.133167-42095-88317220810954/AnsiballZ_lineinfile.py" <<< 41175 1727204653.32337: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41175qa2aqh90/tmpz7eo3f41 /root/.ansible/tmp/ansible-tmp-1727204653.133167-42095-88317220810954/AnsiballZ_lineinfile.py <<< 41175 1727204653.32614: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-41175qa2aqh90/tmpz7eo3f41" to remote "/root/.ansible/tmp/ansible-tmp-1727204653.133167-42095-88317220810954/AnsiballZ_lineinfile.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204653.133167-42095-88317220810954/AnsiballZ_lineinfile.py" <<< 41175 1727204653.33812: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204653.33939: stderr chunk (state=3): >>><<< 41175 1727204653.33954: stdout chunk (state=3): >>><<< 41175 1727204653.33983: done transferring module to remote 41175 1727204653.34003: _low_level_execute_command(): starting 41175 1727204653.34013: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204653.133167-42095-88317220810954/ /root/.ansible/tmp/ansible-tmp-1727204653.133167-42095-88317220810954/AnsiballZ_lineinfile.py && sleep 0' 41175 1727204653.34797: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204653.34826: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204653.34902: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204653.36831: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204653.36847: stdout chunk (state=3): >>><<< 41175 1727204653.36859: stderr chunk (state=3): >>><<< 41175 1727204653.36882: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204653.36895: _low_level_execute_command(): starting 41175 1727204653.36987: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204653.133167-42095-88317220810954/AnsiballZ_lineinfile.py && sleep 0' 41175 1727204653.37553: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 1727204653.37570: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204653.37585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204653.37610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204653.37630: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 41175 1727204653.37745: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204653.37773: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204653.37853: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204653.56432: stdout chunk (state=3): >>> {"changed": true, "msg": "line added", 
"backup": "", "diff": [{"before": "", "after": "", "before_header": "/etc/iproute2/rt_tables.d/table.conf (content)", "after_header": "/etc/iproute2/rt_tables.d/table.conf (content)"}, {"before_header": "/etc/iproute2/rt_tables.d/table.conf (file attributes)", "after_header": "/etc/iproute2/rt_tables.d/table.conf (file attributes)"}], "invocation": {"module_args": {"path": "/etc/iproute2/rt_tables.d/table.conf", "line": "200 custom", "mode": "0644", "create": true, "state": "present", "backrefs": false, "backup": false, "firstmatch": false, "unsafe_writes": false, "regexp": null, "search_string": null, "insertafter": null, "insertbefore": null, "validate": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 41175 1727204653.58044: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 41175 1727204653.58049: stdout chunk (state=3): >>><<< 41175 1727204653.58051: stderr chunk (state=3): >>><<< 41175 1727204653.58085: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "msg": "line added", "backup": "", "diff": [{"before": "", "after": "", "before_header": "/etc/iproute2/rt_tables.d/table.conf (content)", "after_header": "/etc/iproute2/rt_tables.d/table.conf (content)"}, {"before_header": "/etc/iproute2/rt_tables.d/table.conf (file attributes)", "after_header": "/etc/iproute2/rt_tables.d/table.conf (file attributes)"}], "invocation": {"module_args": {"path": "/etc/iproute2/rt_tables.d/table.conf", "line": "200 custom", "mode": "0644", "create": true, "state": "present", "backrefs": false, "backup": false, "firstmatch": false, "unsafe_writes": false, "regexp": null, "search_string": null, "insertafter": null, "insertbefore": null, "validate": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
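The `module_args` echoed back in the result above fully describe the task that was queued at the `TASK [Create a dedicated test file ...]` banner. Reconstructed from the log (task name from the banner, every parameter from the `invocation.module_args` dump; null/default parameters omitted), the playbook task is roughly:

```yaml
- name: Create a dedicated test file in `/etc/iproute2/rt_tables.d/` and add a new routing table
  lineinfile:
    path: /etc/iproute2/rt_tables.d/table.conf
    line: 200 custom
    mode: "0644"
    create: true
    state: present
```

The result `"changed": true, "msg": "line added"` matches a first run against a host where the file did not yet exist: `create: true` makes the module create it, then append the `200 custom` table entry.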
41175 1727204653.58227: done with _execute_module (lineinfile, {'path': '/etc/iproute2/rt_tables.d/table.conf', 'line': '200 custom', 'mode': '0644', 'create': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'lineinfile', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204653.133167-42095-88317220810954/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41175 1727204653.58230: _low_level_execute_command(): starting 41175 1727204653.58233: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204653.133167-42095-88317220810954/ > /dev/null 2>&1 && sleep 0' 41175 1727204653.59260: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 1727204653.59277: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204653.59295: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204653.59423: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 41175 1727204653.59454: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204653.59469: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204653.59549: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204653.61565: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204653.61581: stdout chunk (state=3): >>><<< 41175 1727204653.61595: stderr chunk (state=3): >>><<< 41175 1727204653.61616: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 
1727204653.61633: handler run complete 41175 1727204653.61795: attempt loop complete, returning result 41175 1727204653.61798: _execute() done 41175 1727204653.61801: dumping result to json 41175 1727204653.61803: done dumping result, returning 41175 1727204653.61805: done running TaskExecutor() for managed-node3/TASK: Create a dedicated test file in `/etc/iproute2/rt_tables.d/` and add a new routing table [12b410aa-8751-f070-39c4-00000000005f] 41175 1727204653.61808: sending task result for task 12b410aa-8751-f070-39c4-00000000005f 41175 1727204653.61885: done sending task result for task 12b410aa-8751-f070-39c4-00000000005f 41175 1727204653.61891: WORKER PROCESS EXITING changed: [managed-node3] => { "backup": "", "changed": true } MSG: line added 41175 1727204653.61997: no more pending results, returning what we have 41175 1727204653.62007: results queue empty 41175 1727204653.62009: checking for any_errors_fatal 41175 1727204653.62020: done checking for any_errors_fatal 41175 1727204653.62021: checking for max_fail_percentage 41175 1727204653.62024: done checking for max_fail_percentage 41175 1727204653.62025: checking to see if all hosts have failed and the running result is not ok 41175 1727204653.62026: done checking to see if all hosts have failed 41175 1727204653.62027: getting the remaining hosts for this loop 41175 1727204653.62029: done getting the remaining hosts for this loop 41175 1727204653.62034: getting the next task for host managed-node3 41175 1727204653.62044: done getting next task for host managed-node3 41175 1727204653.62050: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 41175 1727204653.62054: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41175 1727204653.62075: getting variables 41175 1727204653.62077: in VariableManager get_vars() 41175 1727204653.62234: Calling all_inventory to load vars for managed-node3 41175 1727204653.62238: Calling groups_inventory to load vars for managed-node3 41175 1727204653.62241: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204653.62255: Calling all_plugins_play to load vars for managed-node3 41175 1727204653.62259: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204653.62263: Calling groups_plugins_play to load vars for managed-node3 41175 1727204653.65102: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204653.68446: done with get_vars() 41175 1727204653.68486: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 15:04:13 -0400 (0:00:00.596) 0:00:20.824 ***** 41175 1727204653.68608: entering _queue_task() for managed-node3/include_tasks 41175 1727204653.68973: worker is 1 (out of 1 available) 41175 1727204653.68987: exiting _queue_task() for managed-node3/include_tasks 41175 1727204653.69204: done queuing things up, now waiting for results queue to drain 41175 1727204653.69207: waiting for pending results... 
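The next task the strategy picks up, `fedora.linux_system_roles.network : Ensure ansible_facts used by role`, is an include task: the log below shows it short-circuiting in `_execute()` (no module run) and then `set_facts.yml` being loaded and its blocks spliced into the host's task list. A sketch of that pattern, reconstructed from the log (task path `roles/network/tasks/main.yml:4`, included file `set_facts.yml`, condition from the `Evaluated conditional` line):

```yaml
- name: Ensure ansible_facts used by role
  include_tasks: set_facts.yml
  when: ansible_distribution_major_version != '6'
```

This is why the timing for the task is only 0.145s and no connection plugin is loaded for it: includes are resolved controller-side.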
41175 1727204653.69524: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role
41175 1727204653.69530: in run() - task 12b410aa-8751-f070-39c4-000000000067
41175 1727204653.69549: variable 'ansible_search_path' from source: unknown
41175 1727204653.69559: variable 'ansible_search_path' from source: unknown
41175 1727204653.69605: calling self._execute()
41175 1727204653.69727: variable 'ansible_host' from source: host vars for 'managed-node3'
41175 1727204653.69746: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
41175 1727204653.69764: variable 'omit' from source: magic vars
41175 1727204653.70248: variable 'ansible_distribution_major_version' from source: facts
41175 1727204653.70275: Evaluated conditional (ansible_distribution_major_version != '6'): True
41175 1727204653.70296: _execute() done
41175 1727204653.70305: dumping result to json
41175 1727204653.70314: done dumping result, returning
41175 1727204653.70330: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12b410aa-8751-f070-39c4-000000000067]
41175 1727204653.70379: sending task result for task 12b410aa-8751-f070-39c4-000000000067
41175 1727204653.70535: no more pending results, returning what we have
41175 1727204653.70541: in VariableManager get_vars()
41175 1727204653.70711: Calling all_inventory to load vars for managed-node3
41175 1727204653.70715: Calling groups_inventory to load vars for managed-node3
41175 1727204653.70721: Calling all_plugins_inventory to load vars for managed-node3
41175 1727204653.70739: Calling all_plugins_play to load vars for managed-node3
41175 1727204653.70743: Calling groups_plugins_inventory to load vars for managed-node3
41175 1727204653.70747: Calling groups_plugins_play to load vars for managed-node3
41175 1727204653.71306: done sending task result for task 12b410aa-8751-f070-39c4-000000000067
41175 1727204653.71310: WORKER PROCESS EXITING
41175 1727204653.73272: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41175 1727204653.76441: done with get_vars()
41175 1727204653.76482: variable 'ansible_search_path' from source: unknown
41175 1727204653.76484: variable 'ansible_search_path' from source: unknown
41175 1727204653.76539: we have included files to process
41175 1727204653.76541: generating all_blocks data
41175 1727204653.76544: done generating all_blocks data
41175 1727204653.76549: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
41175 1727204653.76550: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
41175 1727204653.76553: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
41175 1727204653.77326: done processing included file
41175 1727204653.77333: iterating over new_blocks loaded from include file
41175 1727204653.77335: in VariableManager get_vars()
41175 1727204653.77369: done with get_vars()
41175 1727204653.77372: filtering new block on tags
41175 1727204653.77397: done filtering new block on tags
41175 1727204653.77400: in VariableManager get_vars()
41175 1727204653.77434: done with get_vars()
41175 1727204653.77441: filtering new block on tags
41175 1727204653.77469: done filtering new block on tags
41175 1727204653.77473: in VariableManager get_vars()
41175 1727204653.77506: done with get_vars()
41175 1727204653.77508: filtering new block on tags
41175 1727204653.77534: done filtering new block on tags
41175 1727204653.77537: done iterating over new_blocks loaded from include file
included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node3
41175 1727204653.77543: extending task lists for all hosts with included blocks
41175 1727204653.78667: done extending task lists
41175 1727204653.78669: done processing included files
41175 1727204653.78670: results queue empty
41175 1727204653.78671: checking for any_errors_fatal
41175 1727204653.78678: done checking for any_errors_fatal
41175 1727204653.78679: checking for max_fail_percentage
41175 1727204653.78680: done checking for max_fail_percentage
41175 1727204653.78681: checking to see if all hosts have failed and the running result is not ok
41175 1727204653.78682: done checking to see if all hosts have failed
41175 1727204653.78683: getting the remaining hosts for this loop
41175 1727204653.78684: done getting the remaining hosts for this loop
41175 1727204653.78687: getting the next task for host managed-node3
41175 1727204653.78694: done getting next task for host managed-node3
41175 1727204653.78696: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present
41175 1727204653.78699: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
41175 1727204653.78710: getting variables
41175 1727204653.78711: in VariableManager get_vars()
41175 1727204653.78734: Calling all_inventory to load vars for managed-node3
41175 1727204653.78737: Calling groups_inventory to load vars for managed-node3
41175 1727204653.78740: Calling all_plugins_inventory to load vars for managed-node3
41175 1727204653.78753: Calling all_plugins_play to load vars for managed-node3
41175 1727204653.78757: Calling groups_plugins_inventory to load vars for managed-node3
41175 1727204653.78762: Calling groups_plugins_play to load vars for managed-node3
41175 1727204653.81295: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41175 1727204653.83062: done with get_vars()
41175 1727204653.83086: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] ***
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3
Tuesday 24 September 2024 15:04:13 -0400 (0:00:00.145) 0:00:20.970 *****
41175 1727204653.83158: entering _queue_task() for managed-node3/setup
41175 1727204653.83439: worker is 1 (out of 1 available)
41175 1727204653.83453: exiting _queue_task() for managed-node3/setup
41175 1727204653.83465: done queuing things up, now waiting for results queue to drain
41175 1727204653.83467: waiting for pending results...
41175 1727204653.83721: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present
41175 1727204653.83932: in run() - task 12b410aa-8751-f070-39c4-0000000005df
41175 1727204653.83958: variable 'ansible_search_path' from source: unknown
41175 1727204653.83968: variable 'ansible_search_path' from source: unknown
41175 1727204653.84021: calling self._execute()
41175 1727204653.84135: variable 'ansible_host' from source: host vars for 'managed-node3'
41175 1727204653.84149: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
41175 1727204653.84165: variable 'omit' from source: magic vars
41175 1727204653.84604: variable 'ansible_distribution_major_version' from source: facts
41175 1727204653.84628: Evaluated conditional (ansible_distribution_major_version != '6'): True
41175 1727204653.84915: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
41175 1727204653.86660: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
41175 1727204653.86723: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
41175 1727204653.86759: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
41175 1727204653.86791: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
41175 1727204653.86814: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
41175 1727204653.86888: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
41175 1727204653.86914: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
41175 1727204653.86938: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
41175 1727204653.86975: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
41175 1727204653.86988: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
41175 1727204653.87039: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
41175 1727204653.87062: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
41175 1727204653.87082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
41175 1727204653.87115: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
41175 1727204653.87131: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
41175 1727204653.87261: variable '__network_required_facts' from source: role '' defaults
41175 1727204653.87271: variable 'ansible_facts' from source: unknown
41175 1727204653.87976: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False
41175 1727204653.87981: when evaluation is False, skipping this task
41175 1727204653.87983: _execute() done
41175 1727204653.87986: dumping result to json
41175 1727204653.87991: done dumping result, returning
41175 1727204653.88000: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12b410aa-8751-f070-39c4-0000000005df]
41175 1727204653.88005: sending task result for task 12b410aa-8751-f070-39c4-0000000005df
41175 1727204653.88104: done sending task result for task 12b410aa-8751-f070-39c4-0000000005df
41175 1727204653.88107: WORKER PROCESS EXITING
skipping: [managed-node3] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
41175 1727204653.88157: no more pending results, returning what we have
41175 1727204653.88162: results queue empty
41175 1727204653.88163: checking for any_errors_fatal
41175 1727204653.88165: done checking for any_errors_fatal
41175 1727204653.88166: checking for max_fail_percentage
41175 1727204653.88167: done checking for max_fail_percentage
41175 1727204653.88168: checking to see if all hosts have failed and the running result is not ok
41175 1727204653.88169: done checking to see if all hosts have failed
41175 1727204653.88170: getting the remaining hosts for this loop
41175 1727204653.88172: done getting the remaining hosts for this loop
41175 1727204653.88176: getting the next task for host managed-node3
41175 1727204653.88187: done getting next task for host managed-node3
41175 1727204653.88195: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree
41175 1727204653.88200: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
41175 1727204653.88224: getting variables
41175 1727204653.88226: in VariableManager get_vars()
41175 1727204653.88274: Calling all_inventory to load vars for managed-node3
41175 1727204653.88277: Calling groups_inventory to load vars for managed-node3
41175 1727204653.88280: Calling all_plugins_inventory to load vars for managed-node3
41175 1727204653.88298: Calling all_plugins_play to load vars for managed-node3
41175 1727204653.88302: Calling groups_plugins_inventory to load vars for managed-node3
41175 1727204653.88306: Calling groups_plugins_play to load vars for managed-node3
41175 1727204653.89571: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41175 1727204653.91279: done with get_vars()
41175 1727204653.91304: done getting variables

TASK [fedora.linux_system_roles.network : Check if system is ostree] ***********
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12
Tuesday 24 September 2024 15:04:13 -0400 (0:00:00.082) 0:00:21.052 *****
41175 1727204653.91396: entering _queue_task() for managed-node3/stat
41175 1727204653.91651: worker is 1 (out of 1 available)
41175 1727204653.91665: exiting _queue_task() for managed-node3/stat
41175 1727204653.91677: done queuing things up, now waiting for results queue to drain
41175 1727204653.91679: waiting for pending results...
41175 1727204653.92209: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if system is ostree
41175 1727204653.92291: in run() - task 12b410aa-8751-f070-39c4-0000000005e1
41175 1727204653.92376: variable 'ansible_search_path' from source: unknown
41175 1727204653.92381: variable 'ansible_search_path' from source: unknown
41175 1727204653.92385: calling self._execute()
41175 1727204653.92504: variable 'ansible_host' from source: host vars for 'managed-node3'
41175 1727204653.92522: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
41175 1727204653.92541: variable 'omit' from source: magic vars
41175 1727204653.93183: variable 'ansible_distribution_major_version' from source: facts
41175 1727204653.93187: Evaluated conditional (ansible_distribution_major_version != '6'): True
41175 1727204653.93295: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
41175 1727204653.93638: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
41175 1727204653.93701: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
41175 1727204653.93755: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
41175 1727204653.93807: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
41175 1727204653.93924: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
41175 1727204653.93974: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
41175 1727204653.94018: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
41175 1727204653.94059: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
41175 1727204653.94197: variable '__network_is_ostree' from source: set_fact
41175 1727204653.94213: Evaluated conditional (not __network_is_ostree is defined): False
41175 1727204653.94226: when evaluation is False, skipping this task
41175 1727204653.94238: _execute() done
41175 1727204653.94248: dumping result to json
41175 1727204653.94259: done dumping result, returning
41175 1727204653.94274: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [12b410aa-8751-f070-39c4-0000000005e1]
41175 1727204653.94295: sending task result for task 12b410aa-8751-f070-39c4-0000000005e1
41175 1727204653.94523: done sending task result for task 12b410aa-8751-f070-39c4-0000000005e1
41175 1727204653.94528: WORKER PROCESS EXITING
skipping: [managed-node3] => {
    "changed": false,
    "false_condition": "not __network_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}
41175 1727204653.94600: no more pending results, returning what we have
41175 1727204653.94604: results queue empty
41175 1727204653.94606: checking for any_errors_fatal
41175 1727204653.94613: done checking for any_errors_fatal
41175 1727204653.94614: checking for max_fail_percentage
41175 1727204653.94615: done checking for max_fail_percentage
41175 1727204653.94618: checking to see if all hosts have failed and the running result is not ok
41175 1727204653.94620: done checking to see if all hosts have failed
41175 1727204653.94621: getting the remaining hosts for this loop
41175 1727204653.94622: done getting the remaining hosts for this loop
41175 1727204653.94627: getting the next task for host managed-node3
41175 1727204653.94634: done getting next task for host managed-node3
41175 1727204653.94639: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree
41175 1727204653.94643: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
41175 1727204653.94660: getting variables
41175 1727204653.94662: in VariableManager get_vars()
41175 1727204653.94750: Calling all_inventory to load vars for managed-node3
41175 1727204653.94753: Calling groups_inventory to load vars for managed-node3
41175 1727204653.94756: Calling all_plugins_inventory to load vars for managed-node3
41175 1727204653.94766: Calling all_plugins_play to load vars for managed-node3
41175 1727204653.94769: Calling groups_plugins_inventory to load vars for managed-node3
41175 1727204653.94773: Calling groups_plugins_play to load vars for managed-node3
41175 1727204653.97686: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41175 1727204653.99691: done with get_vars()
41175 1727204653.99737: done getting variables
41175 1727204653.99814: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] ***
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17
Tuesday 24 September 2024 15:04:13 -0400 (0:00:00.084) 0:00:21.137 *****
41175 1727204653.99861: entering _queue_task() for managed-node3/set_fact
41175 1727204654.00241: worker is 1 (out of 1 available)
41175 1727204654.00255: exiting _queue_task() for managed-node3/set_fact
41175 1727204654.00270: done queuing things up, now waiting for results queue to drain
41175 1727204654.00272: waiting for pending results...
41175 1727204654.00528: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree
41175 1727204654.00739: in run() - task 12b410aa-8751-f070-39c4-0000000005e2
41175 1727204654.00827: variable 'ansible_search_path' from source: unknown
41175 1727204654.00830: variable 'ansible_search_path' from source: unknown
41175 1727204654.00833: calling self._execute()
41175 1727204654.00931: variable 'ansible_host' from source: host vars for 'managed-node3'
41175 1727204654.00951: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
41175 1727204654.00973: variable 'omit' from source: magic vars
41175 1727204654.01403: variable 'ansible_distribution_major_version' from source: facts
41175 1727204654.01422: Evaluated conditional (ansible_distribution_major_version != '6'): True
41175 1727204654.01569: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
41175 1727204654.01793: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
41175 1727204654.01836: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
41175 1727204654.01865: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
41175 1727204654.01896: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
41175 1727204654.01972: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
41175 1727204654.01995: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
41175 1727204654.02018: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
41175 1727204654.02045: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
41175 1727204654.02120: variable '__network_is_ostree' from source: set_fact
41175 1727204654.02131: Evaluated conditional (not __network_is_ostree is defined): False
41175 1727204654.02134: when evaluation is False, skipping this task
41175 1727204654.02137: _execute() done
41175 1727204654.02141: dumping result to json
41175 1727204654.02144: done dumping result, returning
41175 1727204654.02155: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12b410aa-8751-f070-39c4-0000000005e2]
41175 1727204654.02158: sending task result for task 12b410aa-8751-f070-39c4-0000000005e2
41175 1727204654.02245: done sending task result for task 12b410aa-8751-f070-39c4-0000000005e2
41175 1727204654.02249: WORKER PROCESS EXITING
skipping: [managed-node3] => {
    "changed": false,
    "false_condition": "not __network_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}
41175 1727204654.02308: no more pending results, returning what we have
41175 1727204654.02313: results queue empty
41175 1727204654.02314: checking for any_errors_fatal
41175 1727204654.02321: done checking for any_errors_fatal
41175 1727204654.02322: checking for max_fail_percentage
41175 1727204654.02323: done checking for max_fail_percentage
41175 1727204654.02324: checking to see if all hosts have failed and the running result is not ok
41175 1727204654.02325: done checking to see if all hosts have failed
41175 1727204654.02326: getting the remaining hosts for this loop
41175 1727204654.02328: done getting the remaining hosts for this loop
41175 1727204654.02333: getting the next task for host managed-node3
41175 1727204654.02343: done getting next task for host managed-node3
41175 1727204654.02347: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running
41175 1727204654.02352: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
41175 1727204654.02380: getting variables
41175 1727204654.02383: in VariableManager get_vars()
41175 1727204654.02426: Calling all_inventory to load vars for managed-node3
41175 1727204654.02430: Calling groups_inventory to load vars for managed-node3
41175 1727204654.02432: Calling all_plugins_inventory to load vars for managed-node3
41175 1727204654.02442: Calling all_plugins_play to load vars for managed-node3
41175 1727204654.02445: Calling groups_plugins_inventory to load vars for managed-node3
41175 1727204654.02449: Calling groups_plugins_play to load vars for managed-node3
41175 1727204654.04349: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41175 1727204654.05930: done with get_vars()
41175 1727204654.05953: done getting variables

TASK [fedora.linux_system_roles.network : Check which services are running] ****
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Tuesday 24 September 2024 15:04:14 -0400 (0:00:00.061) 0:00:21.199 *****
41175 1727204654.06034: entering _queue_task() for managed-node3/service_facts
41175 1727204654.06271: worker is 1 (out of 1 available)
41175 1727204654.06286: exiting _queue_task() for managed-node3/service_facts
41175 1727204654.06300: done queuing things up, now waiting for results queue to drain
41175 1727204654.06302: waiting for pending results...
41175 1727204654.06503: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which services are running
41175 1727204654.06619: in run() - task 12b410aa-8751-f070-39c4-0000000005e4
41175 1727204654.06639: variable 'ansible_search_path' from source: unknown
41175 1727204654.06643: variable 'ansible_search_path' from source: unknown
41175 1727204654.06673: calling self._execute()
41175 1727204654.06756: variable 'ansible_host' from source: host vars for 'managed-node3'
41175 1727204654.06761: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
41175 1727204654.06773: variable 'omit' from source: magic vars
41175 1727204654.07091: variable 'ansible_distribution_major_version' from source: facts
41175 1727204654.07104: Evaluated conditional (ansible_distribution_major_version != '6'): True
41175 1727204654.07111: variable 'omit' from source: magic vars
41175 1727204654.07174: variable 'omit' from source: magic vars
41175 1727204654.07206: variable 'omit' from source: magic vars
41175 1727204654.07246: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
41175 1727204654.07277: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
41175 1727204654.07300: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
41175 1727204654.07323: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
41175 1727204654.07333: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
41175 1727204654.07362: variable 'inventory_hostname' from source: host vars for 'managed-node3'
41175 1727204654.07365: variable 'ansible_host' from source: host vars for 'managed-node3'
41175 1727204654.07369: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
41175 1727204654.07460: Set connection var ansible_shell_executable to /bin/sh
41175 1727204654.07464: Set connection var ansible_shell_type to sh
41175 1727204654.07470: Set connection var ansible_pipelining to False
41175 1727204654.07479: Set connection var ansible_timeout to 10
41175 1727204654.07485: Set connection var ansible_connection to ssh
41175 1727204654.07493: Set connection var ansible_module_compression to ZIP_DEFLATED
41175 1727204654.07519: variable 'ansible_shell_executable' from source: unknown
41175 1727204654.07522: variable 'ansible_connection' from source: unknown
41175 1727204654.07526: variable 'ansible_module_compression' from source: unknown
41175 1727204654.07530: variable 'ansible_shell_type' from source: unknown
41175 1727204654.07532: variable 'ansible_shell_executable' from source: unknown
41175 1727204654.07535: variable 'ansible_host' from source: host vars for 'managed-node3'
41175 1727204654.07537: variable 'ansible_pipelining' from source: unknown
41175 1727204654.07544: variable 'ansible_timeout' from source: unknown
41175 1727204654.07546: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
41175 1727204654.07715: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action)
41175 1727204654.07728: variable 'omit' from source: magic vars
41175 1727204654.07733: starting attempt loop
41175 1727204654.07736: running the handler
41175 1727204654.07749: _low_level_execute_command(): starting
41175 1727204654.07759: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
41175 1727204654.08302: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
41175 1727204654.08306: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<<
41175 1727204654.08310: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
41175 1727204654.08368: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
41175 1727204654.08377: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
41175 1727204654.08379: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
41175 1727204654.08420: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
41175 1727204654.10201: stdout chunk (state=3): >>>/root <<<
41175 1727204654.10355: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
41175 1727204654.10369: stderr chunk (state=3): >>><<<
41175 1727204654.10372: stdout chunk (state=3): >>><<<
41175 1727204654.10392: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
41175 1727204654.10406: _low_level_execute_command(): starting
41175 1727204654.10415: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204654.1039338-42129-260727160763477 `" && echo ansible-tmp-1727204654.1039338-42129-260727160763477="` echo /root/.ansible/tmp/ansible-tmp-1727204654.1039338-42129-260727160763477 `" ) && sleep 0'
41175 1727204654.10901: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
41175 1727204654.10907: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
41175 1727204654.10910: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
41175 1727204654.10921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
41175 1727204654.10968: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
41175 1727204654.10972: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
41175 1727204654.11016: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
41175 1727204654.13009: stdout chunk (state=3): >>>ansible-tmp-1727204654.1039338-42129-260727160763477=/root/.ansible/tmp/ansible-tmp-1727204654.1039338-42129-260727160763477 <<<
41175 1727204654.13128: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
41175 1727204654.13178: stderr chunk (state=3): >>><<<
41175 1727204654.13183: stdout chunk (state=3): >>><<<
41175 1727204654.13205: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204654.1039338-42129-260727160763477=/root/.ansible/tmp/ansible-tmp-1727204654.1039338-42129-260727160763477 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
41175 1727204654.13248: variable 'ansible_module_compression' from source: unknown
41175 1727204654.13284: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-41175qa2aqh90/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED
41175 1727204654.13325: variable 'ansible_facts' from source: unknown
41175 1727204654.13384: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204654.1039338-42129-260727160763477/AnsiballZ_service_facts.py
41175 1727204654.13500: Sending initial data
41175 1727204654.13504: Sent initial data (162 bytes)
41175 1727204654.13969: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
41175 1727204654.13972: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<<
41175 1727204654.13975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
41175 1727204654.13978: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host
10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204654.14035: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204654.14041: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204654.14075: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204654.15714: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 41175 1727204654.15725: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41175 1727204654.15745: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41175 1727204654.15781: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41175qa2aqh90/tmpoacj0nxt /root/.ansible/tmp/ansible-tmp-1727204654.1039338-42129-260727160763477/AnsiballZ_service_facts.py <<< 41175 1727204654.15784: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204654.1039338-42129-260727160763477/AnsiballZ_service_facts.py" <<< 41175 1727204654.15814: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-41175qa2aqh90/tmpoacj0nxt" to remote "/root/.ansible/tmp/ansible-tmp-1727204654.1039338-42129-260727160763477/AnsiballZ_service_facts.py" <<< 41175 1727204654.15823: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204654.1039338-42129-260727160763477/AnsiballZ_service_facts.py" <<< 41175 1727204654.16612: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204654.16675: stderr chunk (state=3): >>><<< 41175 1727204654.16679: stdout chunk (state=3): >>><<< 41175 1727204654.16708: done transferring module to remote 41175 1727204654.16711: _low_level_execute_command(): starting 41175 1727204654.16717: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204654.1039338-42129-260727160763477/ /root/.ansible/tmp/ansible-tmp-1727204654.1039338-42129-260727160763477/AnsiballZ_service_facts.py && sleep 0' 41175 1727204654.17150: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204654.17188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204654.17195: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204654.17198: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 41175 1727204654.17200: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204654.17202: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204654.17254: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204654.17259: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204654.17298: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204654.19209: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204654.19258: stderr chunk (state=3): >>><<< 41175 1727204654.19262: stdout chunk (state=3): >>><<< 41175 1727204654.19275: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204654.19279: _low_level_execute_command(): starting 41175 1727204654.19286: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204654.1039338-42129-260727160763477/AnsiballZ_service_facts.py && sleep 0' 41175 1727204654.19752: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204654.19755: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 41175 1727204654.19758: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 41175 1727204654.19760: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 41175 1727204654.19763: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204654.19812: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204654.19816: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204654.19867: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204656.21228: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"network": {"name": "network", "state": "running", "status": "enabled", "source": "sysv"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "import-state.service": {"name": "import-state.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "ip6tables.service": {"name": "ip6tables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "iptables.service": {"name": "iptables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "generated", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": 
"enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": 
"rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": 
"inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", 
"source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": 
"grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "loadmodules.service": {"name": "loadmodules.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 41175 1727204656.22899: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204656.23038: stderr chunk (state=3): >>>Shared connection to 10.31.10.90 closed. 
<<< 41175 1727204656.23098: stdout chunk (state=3): >>><<< 41175 1727204656.23101: stderr chunk (state=3): >>><<< 41175 1727204656.23106: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"network": {"name": "network", "state": "running", "status": "enabled", "source": "sysv"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "import-state.service": {"name": "import-state.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"ip6tables.service": {"name": "ip6tables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "iptables.service": {"name": "iptables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "generated", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": 
"sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": 
"systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": 
"systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": 
"dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "loadmodules.service": {"name": "loadmodules.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", 
"state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", 
"state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 41175 1727204656.25482: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204654.1039338-42129-260727160763477/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41175 1727204656.25614: _low_level_execute_command(): starting 41175 1727204656.25621: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204654.1039338-42129-260727160763477/ > /dev/null 2>&1 && sleep 0' 41175 1727204656.27096: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204656.27100: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204656.27103: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204656.27106: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 41175 1727204656.27261: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204656.27334: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204656.27411: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204656.27473: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204656.29512: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204656.29608: stderr chunk (state=3): >>><<< 41175 1727204656.29612: stdout chunk (state=3): >>><<< 41175 1727204656.29633: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204656.29641: handler run complete 41175 1727204656.30296: variable 'ansible_facts' from source: unknown 41175 1727204656.30797: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204656.32596: variable 'ansible_facts' from source: unknown 41175 1727204656.32905: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204656.33643: attempt loop complete, returning result 41175 1727204656.33650: _execute() done 41175 1727204656.33654: dumping result to json 41175 1727204656.33826: done dumping result, returning 41175 1727204656.33838: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which services are running [12b410aa-8751-f070-39c4-0000000005e4] 41175 1727204656.33845: sending task result for task 12b410aa-8751-f070-39c4-0000000005e4 ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 41175 1727204656.36372: no more pending results, returning what we have 41175 1727204656.36376: results queue empty 41175 1727204656.36377: checking for any_errors_fatal 41175 1727204656.36383: done checking for any_errors_fatal 41175 1727204656.36384: checking for max_fail_percentage 41175 1727204656.36386: done checking for max_fail_percentage 41175 1727204656.36387: checking to see if all hosts have failed and the running result is not ok 41175 1727204656.36388: done checking to see if all hosts have failed 41175 1727204656.36391: getting the remaining hosts for this loop 41175 1727204656.36392: done getting the remaining hosts for this loop 41175 1727204656.36397: getting the next task for host managed-node3 41175 1727204656.36403: done 
getting next task for host managed-node3 41175 1727204656.36407: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 41175 1727204656.36411: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204656.36427: getting variables 41175 1727204656.36428: in VariableManager get_vars() 41175 1727204656.36466: Calling all_inventory to load vars for managed-node3 41175 1727204656.36470: Calling groups_inventory to load vars for managed-node3 41175 1727204656.36472: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204656.36483: Calling all_plugins_play to load vars for managed-node3 41175 1727204656.36486: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204656.36608: Calling groups_plugins_play to load vars for managed-node3 41175 1727204656.36626: done sending task result for task 12b410aa-8751-f070-39c4-0000000005e4 41175 1727204656.36629: WORKER PROCESS EXITING 41175 1727204656.41434: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204656.48269: done with get_vars() 41175 1727204656.48457: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 15:04:16 -0400 (0:00:02.426) 0:00:23.625 ***** 41175 1727204656.48752: entering _queue_task() for managed-node3/package_facts 41175 1727204656.49622: worker is 1 (out of 1 available) 41175 1727204656.49639: exiting _queue_task() for managed-node3/package_facts 41175 1727204656.49680: done queuing things up, now waiting for results queue to drain 41175 1727204656.49683: waiting for pending results... 
41175 1727204656.49995: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 41175 1727204656.50165: in run() - task 12b410aa-8751-f070-39c4-0000000005e5 41175 1727204656.50182: variable 'ansible_search_path' from source: unknown 41175 1727204656.50207: variable 'ansible_search_path' from source: unknown 41175 1727204656.50232: calling self._execute() 41175 1727204656.50446: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204656.50451: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204656.50453: variable 'omit' from source: magic vars 41175 1727204656.50813: variable 'ansible_distribution_major_version' from source: facts 41175 1727204656.50827: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204656.50834: variable 'omit' from source: magic vars 41175 1727204656.50941: variable 'omit' from source: magic vars 41175 1727204656.50988: variable 'omit' from source: magic vars 41175 1727204656.51035: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41175 1727204656.51081: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41175 1727204656.51105: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41175 1727204656.51129: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204656.51143: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204656.51182: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41175 1727204656.51186: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204656.51193: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed-node3' 41175 1727204656.51328: Set connection var ansible_shell_executable to /bin/sh 41175 1727204656.51331: Set connection var ansible_shell_type to sh 41175 1727204656.51334: Set connection var ansible_pipelining to False 41175 1727204656.51346: Set connection var ansible_timeout to 10 41175 1727204656.51353: Set connection var ansible_connection to ssh 41175 1727204656.51361: Set connection var ansible_module_compression to ZIP_DEFLATED 41175 1727204656.51394: variable 'ansible_shell_executable' from source: unknown 41175 1727204656.51397: variable 'ansible_connection' from source: unknown 41175 1727204656.51400: variable 'ansible_module_compression' from source: unknown 41175 1727204656.51405: variable 'ansible_shell_type' from source: unknown 41175 1727204656.51408: variable 'ansible_shell_executable' from source: unknown 41175 1727204656.51413: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204656.51423: variable 'ansible_pipelining' from source: unknown 41175 1727204656.51427: variable 'ansible_timeout' from source: unknown 41175 1727204656.51436: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204656.51950: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 41175 1727204656.51955: variable 'omit' from source: magic vars 41175 1727204656.51958: starting attempt loop 41175 1727204656.51961: running the handler 41175 1727204656.51965: _low_level_execute_command(): starting 41175 1727204656.51967: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41175 1727204656.52772: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 1727204656.52776: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 41175 1727204656.52786: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204656.52807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204656.52860: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204656.52926: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204656.52938: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204656.53002: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204656.53034: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204656.54782: stdout chunk (state=3): >>>/root <<< 41175 1727204656.54883: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204656.54950: stderr chunk (state=3): >>><<< 41175 1727204656.54953: stdout chunk (state=3): >>><<< 41175 1727204656.54979: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204656.54996: _low_level_execute_command(): starting 41175 1727204656.55000: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204656.549776-42302-17606225985545 `" && echo ansible-tmp-1727204656.549776-42302-17606225985545="` echo /root/.ansible/tmp/ansible-tmp-1727204656.549776-42302-17606225985545 `" ) && sleep 0' 41175 1727204656.55595: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 1727204656.55607: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204656.55628: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204656.55633: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204656.55694: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 41175 1727204656.55704: stderr chunk (state=3): >>>debug2: match not found <<< 41175 1727204656.55707: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204656.55715: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41175 1727204656.55718: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address <<< 41175 1727204656.55722: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41175 1727204656.55725: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204656.55727: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204656.55739: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204656.55742: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 41175 1727204656.55749: stderr chunk (state=3): >>>debug2: match found <<< 41175 1727204656.55825: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204656.55840: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204656.55857: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204656.55876: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204656.55933: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204656.58226: stdout chunk (state=3): >>>ansible-tmp-1727204656.549776-42302-17606225985545=/root/.ansible/tmp/ansible-tmp-1727204656.549776-42302-17606225985545 <<< 41175 1727204656.58233: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204656.58236: stderr chunk (state=3): >>><<< 41175 1727204656.58239: stdout chunk (state=3): >>><<< 41175 1727204656.58257: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204656.549776-42302-17606225985545=/root/.ansible/tmp/ansible-tmp-1727204656.549776-42302-17606225985545 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204656.58309: variable 'ansible_module_compression' from source: unknown 41175 1727204656.58363: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-41175qa2aqh90/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 41175 1727204656.58917: variable 'ansible_facts' from source: unknown 41175 1727204656.59145: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204656.549776-42302-17606225985545/AnsiballZ_package_facts.py 41175 1727204656.59473: Sending initial data 41175 1727204656.59477: Sent initial data (160 bytes) 41175 1727204656.60208: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204656.60353: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204656.60382: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204656.62131: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41175 1727204656.62164: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41175 1727204656.62226: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41175qa2aqh90/tmpfto9_5si /root/.ansible/tmp/ansible-tmp-1727204656.549776-42302-17606225985545/AnsiballZ_package_facts.py <<< 41175 1727204656.62229: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204656.549776-42302-17606225985545/AnsiballZ_package_facts.py" <<< 41175 1727204656.62324: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-41175qa2aqh90/tmpfto9_5si" to remote "/root/.ansible/tmp/ansible-tmp-1727204656.549776-42302-17606225985545/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204656.549776-42302-17606225985545/AnsiballZ_package_facts.py" <<< 41175 1727204656.66865: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204656.66994: stderr chunk (state=3): >>><<< 41175 1727204656.67050: stdout chunk (state=3): >>><<< 41175 1727204656.67127: done transferring module to remote 41175 1727204656.67254: _low_level_execute_command(): starting 41175 1727204656.67257: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204656.549776-42302-17606225985545/ /root/.ansible/tmp/ansible-tmp-1727204656.549776-42302-17606225985545/AnsiballZ_package_facts.py && sleep 0' 41175 1727204656.68671: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204656.68815: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204656.68834: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204656.68867: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204656.69053: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204656.71066: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204656.71110: stdout chunk (state=3): >>><<< 41175 1727204656.71113: stderr chunk (state=3): >>><<< 41175 1727204656.71333: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204656.71337: _low_level_execute_command(): starting 41175 1727204656.71341: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204656.549776-42302-17606225985545/AnsiballZ_package_facts.py && sleep 0' 41175 1727204656.71929: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 1727204656.71944: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204656.71964: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204656.72045: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204656.72100: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204656.72123: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204656.72161: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204656.72286: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204657.35825: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": 
[{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": 
"intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": 
"vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": 
"bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "r<<< 41175 1727204657.35846: stdout chunk (state=3): >>>pm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", 
"release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": <<< 41175 1727204657.35858: stdout chunk (state=3): >>>"rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": 
"2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "relea<<< 41175 1727204657.35880: stdout chunk (state=3): >>>se": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": 
"17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-<<< 41175 1727204657.35920: stdout chunk (state=3): >>>libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": 
"pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, <<< 41175 1727204657.35934: stdout chunk (state=3): >>>"arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", 
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": 
"2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": 
[{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": 
"mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": 
"1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": 
"perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": 
"2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": 
"rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", 
"release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", 
"release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch<<< 41175 1727204657.36059: stdout chunk (state=3): >>>": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": 
"0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_<<< 41175 1727204657.36075: stdout chunk (state=3): >>>64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": nul<<< 41175 1727204657.36084: stdout chunk (state=3): >>>l, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": 
"device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc39", "epoch": 2, 
"arch": "x86_64", "source": "rpm"}], "chkconfig": [{"name": "chkconfig", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts": [{"name": "initscripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "network-scripts": [{"name": "network-scripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 41175 1727204657.38012: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 41175 1727204657.38079: stderr chunk (state=3): >>><<< 41175 1727204657.38082: stdout chunk (state=3): >>><<< 41175 1727204657.38128: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", 
"release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", 
"release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": 
[{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", 
"version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": 
[{"name": "libgomp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", 
"release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", 
"release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", 
"release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": 
"crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": 
"6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "100.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": 
"libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": 
"1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", 
"version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": 
"11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": 
"501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", 
"version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": 
"rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": 
"2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": 
[{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chkconfig": [{"name": "chkconfig", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts": [{"name": "initscripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "network-scripts": [{"name": "network-scripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 41175 1727204657.40422: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204656.549776-42302-17606225985545/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41175 1727204657.40444: _low_level_execute_command(): starting 41175 1727204657.40449: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204656.549776-42302-17606225985545/ > /dev/null 2>&1 && sleep 0' 41175 1727204657.40958: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204657.40962: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 41175 1727204657.40965: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204657.40967: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204657.40969: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204657.41030: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204657.41033: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204657.41036: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204657.41077: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204657.43004: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204657.43064: stderr chunk (state=3): >>><<< 41175 1727204657.43068: stdout chunk (state=3): >>><<< 41175 1727204657.43081: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204657.43091: handler run complete 41175 1727204657.44095: variable 'ansible_facts' from source: unknown 41175 1727204657.44911: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204657.47048: variable 'ansible_facts' from source: unknown 41175 1727204657.47514: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204657.48869: attempt loop complete, returning result 41175 1727204657.48873: _execute() done 41175 1727204657.48876: dumping result to json 41175 1727204657.49096: done dumping result, returning 41175 1727204657.49106: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [12b410aa-8751-f070-39c4-0000000005e5] 41175 1727204657.49117: sending task result for task 12b410aa-8751-f070-39c4-0000000005e5 41175 1727204657.58075: done sending task result for task 12b410aa-8751-f070-39c4-0000000005e5 41175 1727204657.58079: WORKER PROCESS EXITING ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 41175 1727204657.58267: no more pending results, returning what we have 41175 1727204657.58271: results queue empty 41175 1727204657.58273: checking for any_errors_fatal 41175 1727204657.58279: done checking for any_errors_fatal 41175 1727204657.58280: checking for max_fail_percentage 41175 1727204657.58282: done checking for max_fail_percentage 41175 1727204657.58283: checking to see if all hosts have failed and the running result is not ok 41175 1727204657.58284: done checking to see if all hosts have failed 41175 1727204657.58285: getting the 
remaining hosts for this loop 41175 1727204657.58287: done getting the remaining hosts for this loop 41175 1727204657.58293: getting the next task for host managed-node3 41175 1727204657.58301: done getting next task for host managed-node3 41175 1727204657.58306: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 41175 1727204657.58309: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41175 1727204657.58326: getting variables 41175 1727204657.58328: in VariableManager get_vars() 41175 1727204657.58380: Calling all_inventory to load vars for managed-node3 41175 1727204657.58384: Calling groups_inventory to load vars for managed-node3 41175 1727204657.58387: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204657.58401: Calling all_plugins_play to load vars for managed-node3 41175 1727204657.58404: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204657.58408: Calling groups_plugins_play to load vars for managed-node3 41175 1727204657.60830: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204657.64147: done with get_vars() 41175 1727204657.64215: done getting variables 41175 1727204657.64300: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 15:04:17 -0400 (0:00:01.156) 0:00:24.782 ***** 41175 1727204657.64357: entering _queue_task() for managed-node3/debug 41175 1727204657.64799: worker is 1 (out of 1 available) 41175 1727204657.64814: exiting _queue_task() for managed-node3/debug 41175 1727204657.64830: done queuing things up, now waiting for results queue to drain 41175 1727204657.64832: waiting for pending results... 41175 1727204657.65232: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider 41175 1727204657.65386: in run() - task 12b410aa-8751-f070-39c4-000000000068 41175 1727204657.65411: variable 'ansible_search_path' from source: unknown 41175 1727204657.65439: variable 'ansible_search_path' from source: unknown 41175 1727204657.65496: calling self._execute() 41175 1727204657.65611: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204657.65646: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204657.65698: variable 'omit' from source: magic vars 41175 1727204657.66196: variable 'ansible_distribution_major_version' from source: facts 41175 1727204657.66243: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204657.66246: variable 'omit' from source: magic vars 41175 1727204657.66325: variable 'omit' from source: magic vars 41175 1727204657.66474: variable 'network_provider' from source: set_fact 41175 1727204657.66522: variable 'omit' from source: magic vars 41175 1727204657.66575: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41175 
1727204657.66639: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41175 1727204657.66677: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41175 1727204657.66696: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204657.66737: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204657.66772: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41175 1727204657.66895: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204657.66901: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204657.66953: Set connection var ansible_shell_executable to /bin/sh 41175 1727204657.66963: Set connection var ansible_shell_type to sh 41175 1727204657.66976: Set connection var ansible_pipelining to False 41175 1727204657.66996: Set connection var ansible_timeout to 10 41175 1727204657.67026: Set connection var ansible_connection to ssh 41175 1727204657.67041: Set connection var ansible_module_compression to ZIP_DEFLATED 41175 1727204657.67072: variable 'ansible_shell_executable' from source: unknown 41175 1727204657.67115: variable 'ansible_connection' from source: unknown 41175 1727204657.67129: variable 'ansible_module_compression' from source: unknown 41175 1727204657.67137: variable 'ansible_shell_type' from source: unknown 41175 1727204657.67140: variable 'ansible_shell_executable' from source: unknown 41175 1727204657.67142: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204657.67145: variable 'ansible_pipelining' from source: unknown 41175 1727204657.67147: variable 'ansible_timeout' from source: unknown 41175 1727204657.67239: variable 'ansible_ssh_extra_args' 
from source: host vars for 'managed-node3' 41175 1727204657.67368: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41175 1727204657.67392: variable 'omit' from source: magic vars 41175 1727204657.67405: starting attempt loop 41175 1727204657.67413: running the handler 41175 1727204657.67490: handler run complete 41175 1727204657.67568: attempt loop complete, returning result 41175 1727204657.67576: _execute() done 41175 1727204657.67579: dumping result to json 41175 1727204657.67582: done dumping result, returning 41175 1727204657.67584: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider [12b410aa-8751-f070-39c4-000000000068] 41175 1727204657.67586: sending task result for task 12b410aa-8751-f070-39c4-000000000068 41175 1727204657.67756: done sending task result for task 12b410aa-8751-f070-39c4-000000000068 41175 1727204657.67760: WORKER PROCESS EXITING ok: [managed-node3] => {} MSG: Using network provider: nm 41175 1727204657.67930: no more pending results, returning what we have 41175 1727204657.67934: results queue empty 41175 1727204657.67936: checking for any_errors_fatal 41175 1727204657.67951: done checking for any_errors_fatal 41175 1727204657.67952: checking for max_fail_percentage 41175 1727204657.67954: done checking for max_fail_percentage 41175 1727204657.67955: checking to see if all hosts have failed and the running result is not ok 41175 1727204657.67956: done checking to see if all hosts have failed 41175 1727204657.67957: getting the remaining hosts for this loop 41175 1727204657.67959: done getting the remaining hosts for this loop 41175 1727204657.67965: getting the next task for host managed-node3 41175 1727204657.67973: done getting 
next task for host managed-node3 41175 1727204657.67978: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 41175 1727204657.67982: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41175 1727204657.67999: getting variables 41175 1727204657.68001: in VariableManager get_vars() 41175 1727204657.68054: Calling all_inventory to load vars for managed-node3 41175 1727204657.68058: Calling groups_inventory to load vars for managed-node3 41175 1727204657.68061: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204657.68075: Calling all_plugins_play to load vars for managed-node3 41175 1727204657.68078: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204657.68082: Calling groups_plugins_play to load vars for managed-node3 41175 1727204657.75380: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204657.78843: done with get_vars() 41175 1727204657.78893: done getting variables 41175 1727204657.79001: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK 
[fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 15:04:17 -0400 (0:00:00.146) 0:00:24.929 ***** 41175 1727204657.79035: entering _queue_task() for managed-node3/fail 41175 1727204657.79432: worker is 1 (out of 1 available) 41175 1727204657.79448: exiting _queue_task() for managed-node3/fail 41175 1727204657.79462: done queuing things up, now waiting for results queue to drain 41175 1727204657.79465: waiting for pending results... 41175 1727204657.79919: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 41175 1727204657.79981: in run() - task 12b410aa-8751-f070-39c4-000000000069 41175 1727204657.80002: variable 'ansible_search_path' from source: unknown 41175 1727204657.80005: variable 'ansible_search_path' from source: unknown 41175 1727204657.80048: calling self._execute() 41175 1727204657.80150: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204657.80156: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204657.80164: variable 'omit' from source: magic vars 41175 1727204657.80504: variable 'ansible_distribution_major_version' from source: facts 41175 1727204657.80515: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204657.80623: variable 'network_state' from source: role '' defaults 41175 1727204657.80633: Evaluated conditional (network_state != {}): False 41175 1727204657.80636: when evaluation is False, skipping this task 41175 1727204657.80639: _execute() done 41175 1727204657.80644: dumping result to json 41175 1727204657.80649: done dumping result, returning 41175 
1727204657.80657: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12b410aa-8751-f070-39c4-000000000069] 41175 1727204657.80664: sending task result for task 12b410aa-8751-f070-39c4-000000000069 41175 1727204657.80765: done sending task result for task 12b410aa-8751-f070-39c4-000000000069 41175 1727204657.80768: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 41175 1727204657.80822: no more pending results, returning what we have 41175 1727204657.80827: results queue empty 41175 1727204657.80828: checking for any_errors_fatal 41175 1727204657.80836: done checking for any_errors_fatal 41175 1727204657.80837: checking for max_fail_percentage 41175 1727204657.80839: done checking for max_fail_percentage 41175 1727204657.80840: checking to see if all hosts have failed and the running result is not ok 41175 1727204657.80841: done checking to see if all hosts have failed 41175 1727204657.80842: getting the remaining hosts for this loop 41175 1727204657.80844: done getting the remaining hosts for this loop 41175 1727204657.80848: getting the next task for host managed-node3 41175 1727204657.80855: done getting next task for host managed-node3 41175 1727204657.80858: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 41175 1727204657.80862: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41175 1727204657.80883: getting variables 41175 1727204657.80885: in VariableManager get_vars() 41175 1727204657.80930: Calling all_inventory to load vars for managed-node3 41175 1727204657.80933: Calling groups_inventory to load vars for managed-node3 41175 1727204657.80936: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204657.80947: Calling all_plugins_play to load vars for managed-node3 41175 1727204657.80950: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204657.80953: Calling groups_plugins_play to load vars for managed-node3 41175 1727204657.82909: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204657.86393: done with get_vars() 41175 1727204657.86428: done getting variables 41175 1727204657.86508: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 15:04:17 -0400 (0:00:00.075) 0:00:25.004 ***** 41175 1727204657.86549: entering _queue_task() for managed-node3/fail 41175 1727204657.86939: worker is 1 (out of 1 available) 41175 1727204657.86954: exiting _queue_task() for managed-node3/fail 41175 1727204657.86966: done queuing things up, now waiting for results queue to drain 41175 1727204657.86968: waiting for 
pending results... 41175 1727204657.87508: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 41175 1727204657.87514: in run() - task 12b410aa-8751-f070-39c4-00000000006a 41175 1727204657.87519: variable 'ansible_search_path' from source: unknown 41175 1727204657.87523: variable 'ansible_search_path' from source: unknown 41175 1727204657.87526: calling self._execute() 41175 1727204657.87551: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204657.87566: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204657.87584: variable 'omit' from source: magic vars 41175 1727204657.88037: variable 'ansible_distribution_major_version' from source: facts 41175 1727204657.88054: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204657.88233: variable 'network_state' from source: role '' defaults 41175 1727204657.88255: Evaluated conditional (network_state != {}): False 41175 1727204657.88260: when evaluation is False, skipping this task 41175 1727204657.88265: _execute() done 41175 1727204657.88268: dumping result to json 41175 1727204657.88271: done dumping result, returning 41175 1727204657.88279: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [12b410aa-8751-f070-39c4-00000000006a] 41175 1727204657.88287: sending task result for task 12b410aa-8751-f070-39c4-00000000006a 41175 1727204657.88394: done sending task result for task 12b410aa-8751-f070-39c4-00000000006a 41175 1727204657.88397: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 41175 1727204657.88456: no more pending results, returning what we 
have 41175 1727204657.88461: results queue empty 41175 1727204657.88462: checking for any_errors_fatal 41175 1727204657.88472: done checking for any_errors_fatal 41175 1727204657.88473: checking for max_fail_percentage 41175 1727204657.88475: done checking for max_fail_percentage 41175 1727204657.88476: checking to see if all hosts have failed and the running result is not ok 41175 1727204657.88478: done checking to see if all hosts have failed 41175 1727204657.88479: getting the remaining hosts for this loop 41175 1727204657.88481: done getting the remaining hosts for this loop 41175 1727204657.88486: getting the next task for host managed-node3 41175 1727204657.88497: done getting next task for host managed-node3 41175 1727204657.88693: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 41175 1727204657.88697: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204657.88718: getting variables 41175 1727204657.88720: in VariableManager get_vars() 41175 1727204657.88765: Calling all_inventory to load vars for managed-node3 41175 1727204657.88769: Calling groups_inventory to load vars for managed-node3 41175 1727204657.88771: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204657.88783: Calling all_plugins_play to load vars for managed-node3 41175 1727204657.88786: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204657.88801: Calling groups_plugins_play to load vars for managed-node3 41175 1727204657.91139: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204657.94297: done with get_vars() 41175 1727204657.94337: done getting variables 41175 1727204657.94423: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 15:04:17 -0400 (0:00:00.079) 0:00:25.083 ***** 41175 1727204657.94471: entering _queue_task() for managed-node3/fail 41175 1727204657.94879: worker is 1 (out of 1 available) 41175 1727204657.95002: exiting _queue_task() for managed-node3/fail 41175 1727204657.95015: done queuing things up, now waiting for results queue to drain 41175 1727204657.95016: waiting for pending results... 
41175 1727204657.95274: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 41175 1727204657.95477: in run() - task 12b410aa-8751-f070-39c4-00000000006b 41175 1727204657.95482: variable 'ansible_search_path' from source: unknown 41175 1727204657.95486: variable 'ansible_search_path' from source: unknown 41175 1727204657.95491: calling self._execute() 41175 1727204657.95595: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204657.95603: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204657.95615: variable 'omit' from source: magic vars 41175 1727204657.96105: variable 'ansible_distribution_major_version' from source: facts 41175 1727204657.96122: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204657.96415: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41175 1727204657.99164: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41175 1727204657.99549: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41175 1727204657.99583: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41175 1727204657.99615: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41175 1727204657.99641: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41175 1727204657.99711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204657.99740: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204657.99762: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204657.99800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204657.99813: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204657.99900: variable 'ansible_distribution_major_version' from source: facts 41175 1727204657.99915: Evaluated conditional (ansible_distribution_major_version | int > 9): True 41175 1727204658.00018: variable 'ansible_distribution' from source: facts 41175 1727204658.00025: variable '__network_rh_distros' from source: role '' defaults 41175 1727204658.00034: Evaluated conditional (ansible_distribution in __network_rh_distros): False 41175 1727204658.00038: when evaluation is False, skipping this task 41175 1727204658.00041: _execute() done 41175 1727204658.00046: dumping result to json 41175 1727204658.00049: done dumping result, returning 41175 1727204658.00060: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12b410aa-8751-f070-39c4-00000000006b] 41175 1727204658.00063: sending task result for task 12b410aa-8751-f070-39c4-00000000006b 41175 1727204658.00164: done sending task result for task 12b410aa-8751-f070-39c4-00000000006b 41175 1727204658.00167: WORKER PROCESS EXITING 
skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 41175 1727204658.00221: no more pending results, returning what we have 41175 1727204658.00224: results queue empty 41175 1727204658.00225: checking for any_errors_fatal 41175 1727204658.00233: done checking for any_errors_fatal 41175 1727204658.00233: checking for max_fail_percentage 41175 1727204658.00236: done checking for max_fail_percentage 41175 1727204658.00237: checking to see if all hosts have failed and the running result is not ok 41175 1727204658.00238: done checking to see if all hosts have failed 41175 1727204658.00239: getting the remaining hosts for this loop 41175 1727204658.00240: done getting the remaining hosts for this loop 41175 1727204658.00245: getting the next task for host managed-node3 41175 1727204658.00253: done getting next task for host managed-node3 41175 1727204658.00257: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 41175 1727204658.00260: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204658.00281: getting variables 41175 1727204658.00283: in VariableManager get_vars() 41175 1727204658.00331: Calling all_inventory to load vars for managed-node3 41175 1727204658.00334: Calling groups_inventory to load vars for managed-node3 41175 1727204658.00337: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204658.00349: Calling all_plugins_play to load vars for managed-node3 41175 1727204658.00352: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204658.00355: Calling groups_plugins_play to load vars for managed-node3 41175 1727204658.01784: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204658.03959: done with get_vars() 41175 1727204658.03995: done getting variables 41175 1727204658.04066: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 15:04:18 -0400 (0:00:00.098) 0:00:25.181 ***** 41175 1727204658.04273: entering _queue_task() for managed-node3/dnf 41175 1727204658.04895: worker is 1 (out of 1 available) 41175 1727204658.04912: exiting _queue_task() for managed-node3/dnf 41175 1727204658.04926: done queuing things up, now waiting for results queue to drain 41175 1727204658.04928: waiting for pending results... 
41175 1727204658.05720: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 41175 1727204658.06098: in run() - task 12b410aa-8751-f070-39c4-00000000006c 41175 1727204658.06102: variable 'ansible_search_path' from source: unknown 41175 1727204658.06105: variable 'ansible_search_path' from source: unknown 41175 1727204658.06127: calling self._execute() 41175 1727204658.06350: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204658.06496: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204658.06500: variable 'omit' from source: magic vars 41175 1727204658.06948: variable 'ansible_distribution_major_version' from source: facts 41175 1727204658.06973: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204658.07263: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41175 1727204658.11029: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41175 1727204658.11135: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41175 1727204658.11212: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41175 1727204658.11363: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41175 1727204658.11367: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41175 1727204658.11671: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204658.11675: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204658.11678: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204658.11806: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204658.11830: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204658.12207: variable 'ansible_distribution' from source: facts 41175 1727204658.12325: variable 'ansible_distribution_major_version' from source: facts 41175 1727204658.12329: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 41175 1727204658.12546: variable '__network_wireless_connections_defined' from source: role '' defaults 41175 1727204658.12868: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204658.13013: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204658.13051: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204658.13138: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204658.13217: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204658.13362: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204658.13399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204658.13505: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204658.13564: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204658.13798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204658.13802: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204658.13804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 
1727204658.13915: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204658.13972: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204658.14037: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204658.14284: variable 'network_connections' from source: task vars 41175 1727204658.14308: variable 'interface' from source: set_fact 41175 1727204658.14412: variable 'interface' from source: set_fact 41175 1727204658.14429: variable 'interface' from source: set_fact 41175 1727204658.14514: variable 'interface' from source: set_fact 41175 1727204658.14622: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41175 1727204658.14866: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41175 1727204658.14924: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41175 1727204658.14967: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41175 1727204658.15012: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41175 1727204658.15073: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41175 1727204658.15129: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41175 1727204658.15185: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204658.15232: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41175 1727204658.15355: variable '__network_team_connections_defined' from source: role '' defaults 41175 1727204658.15807: variable 'network_connections' from source: task vars 41175 1727204658.15813: variable 'interface' from source: set_fact 41175 1727204658.15904: variable 'interface' from source: set_fact 41175 1727204658.15912: variable 'interface' from source: set_fact 41175 1727204658.16001: variable 'interface' from source: set_fact 41175 1727204658.16051: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 41175 1727204658.16060: when evaluation is False, skipping this task 41175 1727204658.16063: _execute() done 41175 1727204658.16066: dumping result to json 41175 1727204658.16085: done dumping result, returning 41175 1727204658.16088: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12b410aa-8751-f070-39c4-00000000006c] 41175 1727204658.16093: sending task result for task 12b410aa-8751-f070-39c4-00000000006c 41175 1727204658.16269: done sending task result for task 12b410aa-8751-f070-39c4-00000000006c 41175 1727204658.16272: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": 
"Conditional result was False" } 41175 1727204658.16357: no more pending results, returning what we have 41175 1727204658.16361: results queue empty 41175 1727204658.16363: checking for any_errors_fatal 41175 1727204658.16370: done checking for any_errors_fatal 41175 1727204658.16371: checking for max_fail_percentage 41175 1727204658.16374: done checking for max_fail_percentage 41175 1727204658.16375: checking to see if all hosts have failed and the running result is not ok 41175 1727204658.16376: done checking to see if all hosts have failed 41175 1727204658.16377: getting the remaining hosts for this loop 41175 1727204658.16379: done getting the remaining hosts for this loop 41175 1727204658.16384: getting the next task for host managed-node3 41175 1727204658.16395: done getting next task for host managed-node3 41175 1727204658.16399: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 41175 1727204658.16402: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204658.16420: getting variables 41175 1727204658.16422: in VariableManager get_vars() 41175 1727204658.16465: Calling all_inventory to load vars for managed-node3 41175 1727204658.16468: Calling groups_inventory to load vars for managed-node3 41175 1727204658.16471: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204658.16482: Calling all_plugins_play to load vars for managed-node3 41175 1727204658.16484: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204658.16488: Calling groups_plugins_play to load vars for managed-node3 41175 1727204658.18234: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204658.20685: done with get_vars() 41175 1727204658.20710: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 41175 1727204658.20777: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 15:04:18 -0400 (0:00:00.165) 0:00:25.346 ***** 41175 1727204658.20806: entering _queue_task() for managed-node3/yum 41175 1727204658.21093: worker is 1 (out of 1 available) 41175 1727204658.21109: exiting _queue_task() for managed-node3/yum 41175 1727204658.21126: done queuing things up, now waiting for results queue to drain 41175 1727204658.21128: waiting for pending results... 
41175 1727204658.21411: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 41175 1727204658.21596: in run() - task 12b410aa-8751-f070-39c4-00000000006d 41175 1727204658.21601: variable 'ansible_search_path' from source: unknown 41175 1727204658.21604: variable 'ansible_search_path' from source: unknown 41175 1727204658.21621: calling self._execute() 41175 1727204658.21732: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204658.21748: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204658.21766: variable 'omit' from source: magic vars 41175 1727204658.22241: variable 'ansible_distribution_major_version' from source: facts 41175 1727204658.22262: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204658.22594: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41175 1727204658.25192: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41175 1727204658.25290: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41175 1727204658.25346: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41175 1727204658.25406: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41175 1727204658.25444: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41175 1727204658.25552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204658.25600: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204658.25641: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204658.25708: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204658.25737: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204658.25861: variable 'ansible_distribution_major_version' from source: facts 41175 1727204658.25890: Evaluated conditional (ansible_distribution_major_version | int < 8): False 41175 1727204658.25994: when evaluation is False, skipping this task 41175 1727204658.25998: _execute() done 41175 1727204658.26000: dumping result to json 41175 1727204658.26003: done dumping result, returning 41175 1727204658.26007: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12b410aa-8751-f070-39c4-00000000006d] 41175 1727204658.26010: sending task result for task 12b410aa-8751-f070-39c4-00000000006d 41175 1727204658.26107: done sending task result for task 12b410aa-8751-f070-39c4-00000000006d 41175 1727204658.26111: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 41175 1727204658.26185: no more pending results, returning 
what we have 41175 1727204658.26191: results queue empty 41175 1727204658.26193: checking for any_errors_fatal 41175 1727204658.26201: done checking for any_errors_fatal 41175 1727204658.26202: checking for max_fail_percentage 41175 1727204658.26204: done checking for max_fail_percentage 41175 1727204658.26205: checking to see if all hosts have failed and the running result is not ok 41175 1727204658.26205: done checking to see if all hosts have failed 41175 1727204658.26206: getting the remaining hosts for this loop 41175 1727204658.26208: done getting the remaining hosts for this loop 41175 1727204658.26213: getting the next task for host managed-node3 41175 1727204658.26222: done getting next task for host managed-node3 41175 1727204658.26228: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 41175 1727204658.26231: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204658.26252: getting variables 41175 1727204658.26253: in VariableManager get_vars() 41175 1727204658.26483: Calling all_inventory to load vars for managed-node3 41175 1727204658.26487: Calling groups_inventory to load vars for managed-node3 41175 1727204658.26493: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204658.26505: Calling all_plugins_play to load vars for managed-node3 41175 1727204658.26508: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204658.26512: Calling groups_plugins_play to load vars for managed-node3 41175 1727204658.28612: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204658.31631: done with get_vars() 41175 1727204658.31683: done getting variables 41175 1727204658.31765: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 15:04:18 -0400 (0:00:00.109) 0:00:25.456 ***** 41175 1727204658.31809: entering _queue_task() for managed-node3/fail 41175 1727204658.32320: worker is 1 (out of 1 available) 41175 1727204658.32336: exiting _queue_task() for managed-node3/fail 41175 1727204658.32347: done queuing things up, now waiting for results queue to drain 41175 1727204658.32351: waiting for pending results... 
41175 1727204658.32711: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 41175 1727204658.32783: in run() - task 12b410aa-8751-f070-39c4-00000000006e 41175 1727204658.32816: variable 'ansible_search_path' from source: unknown 41175 1727204658.32830: variable 'ansible_search_path' from source: unknown 41175 1727204658.32880: calling self._execute() 41175 1727204658.33002: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204658.33023: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204658.33042: variable 'omit' from source: magic vars 41175 1727204658.33529: variable 'ansible_distribution_major_version' from source: facts 41175 1727204658.33551: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204658.33726: variable '__network_wireless_connections_defined' from source: role '' defaults 41175 1727204658.34096: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41175 1727204658.36759: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41175 1727204658.37299: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41175 1727204658.37351: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41175 1727204658.37407: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41175 1727204658.37444: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41175 1727204658.37553: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 41175 1727204658.37610: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204658.37649: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204658.37714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204658.37737: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204658.37806: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204658.37844: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204658.37916: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204658.37944: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204658.37967: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204658.38032: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204658.38070: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204658.38106: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204658.38245: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204658.38248: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204658.38450: variable 'network_connections' from source: task vars 41175 1727204658.38478: variable 'interface' from source: set_fact 41175 1727204658.38587: variable 'interface' from source: set_fact 41175 1727204658.38606: variable 'interface' from source: set_fact 41175 1727204658.38704: variable 'interface' from source: set_fact 41175 1727204658.38895: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41175 1727204658.39050: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41175 1727204658.39102: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41175 1727204658.39154: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41175 1727204658.39210: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41175 1727204658.39278: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41175 1727204658.39314: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41175 1727204658.39360: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204658.39403: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41175 1727204658.39557: variable '__network_team_connections_defined' from source: role '' defaults 41175 1727204658.39873: variable 'network_connections' from source: task vars 41175 1727204658.39895: variable 'interface' from source: set_fact 41175 1727204658.39976: variable 'interface' from source: set_fact 41175 1727204658.39995: variable 'interface' from source: set_fact 41175 1727204658.40075: variable 'interface' from source: set_fact 41175 1727204658.40143: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 41175 1727204658.40221: when evaluation is False, skipping this task 41175 1727204658.40224: _execute() done 41175 1727204658.40227: dumping result to json 41175 1727204658.40229: done dumping result, returning 41175 1727204658.40231: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's 
consent to restart NetworkManager due to wireless or team interfaces [12b410aa-8751-f070-39c4-00000000006e] 41175 1727204658.40241: sending task result for task 12b410aa-8751-f070-39c4-00000000006e skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 41175 1727204658.40393: no more pending results, returning what we have 41175 1727204658.40397: results queue empty 41175 1727204658.40398: checking for any_errors_fatal 41175 1727204658.40407: done checking for any_errors_fatal 41175 1727204658.40408: checking for max_fail_percentage 41175 1727204658.40410: done checking for max_fail_percentage 41175 1727204658.40411: checking to see if all hosts have failed and the running result is not ok 41175 1727204658.40412: done checking to see if all hosts have failed 41175 1727204658.40413: getting the remaining hosts for this loop 41175 1727204658.40415: done getting the remaining hosts for this loop 41175 1727204658.40420: getting the next task for host managed-node3 41175 1727204658.40429: done getting next task for host managed-node3 41175 1727204658.40433: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 41175 1727204658.40439: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204658.40461: getting variables 41175 1727204658.40463: in VariableManager get_vars() 41175 1727204658.40739: Calling all_inventory to load vars for managed-node3 41175 1727204658.40743: Calling groups_inventory to load vars for managed-node3 41175 1727204658.40746: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204658.40760: Calling all_plugins_play to load vars for managed-node3 41175 1727204658.40764: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204658.40768: Calling groups_plugins_play to load vars for managed-node3 41175 1727204658.41407: done sending task result for task 12b410aa-8751-f070-39c4-00000000006e 41175 1727204658.41411: WORKER PROCESS EXITING 41175 1727204658.43623: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204658.47021: done with get_vars() 41175 1727204658.47073: done getting variables 41175 1727204658.47173: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 15:04:18 -0400 (0:00:00.154) 0:00:25.611 ***** 41175 1727204658.47232: entering _queue_task() for managed-node3/package 41175 1727204658.47725: worker is 1 (out of 1 available) 41175 1727204658.47740: exiting _queue_task() for managed-node3/package 41175 1727204658.47753: done queuing things up, now waiting for results queue to drain 41175 1727204658.47755: waiting for pending results... 
41175 1727204658.48072: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages 41175 1727204658.48258: in run() - task 12b410aa-8751-f070-39c4-00000000006f 41175 1727204658.48281: variable 'ansible_search_path' from source: unknown 41175 1727204658.48292: variable 'ansible_search_path' from source: unknown 41175 1727204658.48346: calling self._execute() 41175 1727204658.48461: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204658.48480: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204658.48503: variable 'omit' from source: magic vars 41175 1727204658.49012: variable 'ansible_distribution_major_version' from source: facts 41175 1727204658.49033: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204658.49315: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41175 1727204658.49662: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41175 1727204658.49752: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41175 1727204658.49776: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41175 1727204658.49872: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41175 1727204658.50030: variable 'network_packages' from source: role '' defaults 41175 1727204658.50195: variable '__network_provider_setup' from source: role '' defaults 41175 1727204658.50295: variable '__network_service_name_default_nm' from source: role '' defaults 41175 1727204658.50317: variable '__network_service_name_default_nm' from source: role '' defaults 41175 1727204658.50333: variable '__network_packages_default_nm' from source: role '' defaults 41175 1727204658.50430: variable 
'__network_packages_default_nm' from source: role '' defaults 41175 1727204658.50691: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41175 1727204658.53106: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41175 1727204658.53202: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41175 1727204658.53260: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41175 1727204658.53309: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41175 1727204658.53353: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41175 1727204658.53498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204658.53520: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204658.53563: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204658.53632: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204658.53656: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 
1727204658.53732: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204658.53769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204658.53810: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204658.53874: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204658.53900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204658.54240: variable '__network_packages_default_gobject_packages' from source: role '' defaults 41175 1727204658.54415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204658.54455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204658.54503: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204658.54562: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204658.54694: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204658.54725: variable 'ansible_python' from source: facts 41175 1727204658.54767: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 41175 1727204658.54885: variable '__network_wpa_supplicant_required' from source: role '' defaults 41175 1727204658.54999: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 41175 1727204658.55210: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204658.55252: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204658.55295: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204658.55351: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204658.55383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204658.55453: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204658.55580: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204658.55584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204658.55616: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204658.55640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204658.55849: variable 'network_connections' from source: task vars 41175 1727204658.55861: variable 'interface' from source: set_fact 41175 1727204658.55995: variable 'interface' from source: set_fact 41175 1727204658.56017: variable 'interface' from source: set_fact 41175 1727204658.56159: variable 'interface' from source: set_fact 41175 1727204658.56298: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41175 1727204658.56319: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41175 1727204658.56376: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204658.56422: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41175 1727204658.56561: variable '__network_wireless_connections_defined' from source: role '' defaults 41175 1727204658.56918: variable 'network_connections' from source: task vars 41175 1727204658.56929: variable 'interface' from source: set_fact 41175 1727204658.57056: variable 'interface' from source: set_fact 41175 1727204658.57073: variable 'interface' from source: set_fact 41175 1727204658.57203: variable 'interface' from source: set_fact 41175 1727204658.57281: variable '__network_packages_default_wireless' from source: role '' defaults 41175 1727204658.57406: variable '__network_wireless_connections_defined' from source: role '' defaults 41175 1727204658.57875: variable 'network_connections' from source: task vars 41175 1727204658.57887: variable 'interface' from source: set_fact 41175 1727204658.57977: variable 'interface' from source: set_fact 41175 1727204658.58072: variable 'interface' from source: set_fact 41175 1727204658.58075: variable 'interface' from source: set_fact 41175 1727204658.58123: variable '__network_packages_default_team' from source: role '' defaults 41175 1727204658.58237: variable '__network_team_connections_defined' from source: role '' defaults 41175 1727204658.58629: variable 'network_connections' from source: task vars 41175 1727204658.58634: variable 'interface' from source: set_fact 41175 1727204658.58691: variable 'interface' from source: set_fact 41175 1727204658.58697: variable 'interface' from source: set_fact 41175 1727204658.58756: variable 'interface' from source: set_fact 41175 1727204658.58840: variable '__network_service_name_default_initscripts' from source: role '' defaults 41175 
1727204658.58888: variable '__network_service_name_default_initscripts' from source: role '' defaults 41175 1727204658.58897: variable '__network_packages_default_initscripts' from source: role '' defaults 41175 1727204658.58950: variable '__network_packages_default_initscripts' from source: role '' defaults 41175 1727204658.59138: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 41175 1727204658.59716: variable 'network_connections' from source: task vars 41175 1727204658.59723: variable 'interface' from source: set_fact 41175 1727204658.59777: variable 'interface' from source: set_fact 41175 1727204658.59785: variable 'interface' from source: set_fact 41175 1727204658.59843: variable 'interface' from source: set_fact 41175 1727204658.59862: variable 'ansible_distribution' from source: facts 41175 1727204658.59865: variable '__network_rh_distros' from source: role '' defaults 41175 1727204658.59872: variable 'ansible_distribution_major_version' from source: facts 41175 1727204658.59896: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 41175 1727204658.60038: variable 'ansible_distribution' from source: facts 41175 1727204658.60041: variable '__network_rh_distros' from source: role '' defaults 41175 1727204658.60048: variable 'ansible_distribution_major_version' from source: facts 41175 1727204658.60055: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 41175 1727204658.60222: variable 'ansible_distribution' from source: facts 41175 1727204658.60226: variable '__network_rh_distros' from source: role '' defaults 41175 1727204658.60229: variable 'ansible_distribution_major_version' from source: facts 41175 1727204658.60294: variable 'network_provider' from source: set_fact 41175 1727204658.60298: variable 'ansible_facts' from source: unknown 41175 1727204658.61213: Evaluated conditional (not network_packages is 
subset(ansible_facts.packages.keys())): False 41175 1727204658.61220: when evaluation is False, skipping this task 41175 1727204658.61233: _execute() done 41175 1727204658.61262: dumping result to json 41175 1727204658.61266: done dumping result, returning 41175 1727204658.61276: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages [12b410aa-8751-f070-39c4-00000000006f] 41175 1727204658.61282: sending task result for task 12b410aa-8751-f070-39c4-00000000006f 41175 1727204658.61361: done sending task result for task 12b410aa-8751-f070-39c4-00000000006f 41175 1727204658.61364: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 41175 1727204658.61426: no more pending results, returning what we have 41175 1727204658.61430: results queue empty 41175 1727204658.61432: checking for any_errors_fatal 41175 1727204658.61445: done checking for any_errors_fatal 41175 1727204658.61446: checking for max_fail_percentage 41175 1727204658.61447: done checking for max_fail_percentage 41175 1727204658.61448: checking to see if all hosts have failed and the running result is not ok 41175 1727204658.61449: done checking to see if all hosts have failed 41175 1727204658.61450: getting the remaining hosts for this loop 41175 1727204658.61452: done getting the remaining hosts for this loop 41175 1727204658.61457: getting the next task for host managed-node3 41175 1727204658.61465: done getting next task for host managed-node3 41175 1727204658.61469: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 41175 1727204658.61472: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41175 1727204658.61507: getting variables 41175 1727204658.61509: in VariableManager get_vars() 41175 1727204658.61570: Calling all_inventory to load vars for managed-node3 41175 1727204658.61574: Calling groups_inventory to load vars for managed-node3 41175 1727204658.61577: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204658.61698: Calling all_plugins_play to load vars for managed-node3 41175 1727204658.61706: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204658.61712: Calling groups_plugins_play to load vars for managed-node3 41175 1727204658.64584: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204658.66928: done with get_vars() 41175 1727204658.66953: done getting variables 41175 1727204658.67011: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 15:04:18 -0400 (0:00:00.198) 0:00:25.809 ***** 41175 1727204658.67040: entering _queue_task() for managed-node3/package 41175 1727204658.67317: worker is 1 (out of 1 available) 41175 1727204658.67333: exiting 
_queue_task() for managed-node3/package 41175 1727204658.67346: done queuing things up, now waiting for results queue to drain 41175 1727204658.67348: waiting for pending results... 41175 1727204658.67556: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 41175 1727204658.67672: in run() - task 12b410aa-8751-f070-39c4-000000000070 41175 1727204658.67689: variable 'ansible_search_path' from source: unknown 41175 1727204658.67695: variable 'ansible_search_path' from source: unknown 41175 1727204658.67728: calling self._execute() 41175 1727204658.68012: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204658.68016: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204658.68020: variable 'omit' from source: magic vars 41175 1727204658.68501: variable 'ansible_distribution_major_version' from source: facts 41175 1727204658.68506: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204658.68532: variable 'network_state' from source: role '' defaults 41175 1727204658.68550: Evaluated conditional (network_state != {}): False 41175 1727204658.68559: when evaluation is False, skipping this task 41175 1727204658.68566: _execute() done 41175 1727204658.68573: dumping result to json 41175 1727204658.68582: done dumping result, returning 41175 1727204658.68598: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12b410aa-8751-f070-39c4-000000000070] 41175 1727204658.68625: sending task result for task 12b410aa-8751-f070-39c4-000000000070 skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 41175 1727204658.68876: no more pending results, returning what we have 41175 1727204658.68881: results queue 
empty 41175 1727204658.68883: checking for any_errors_fatal 41175 1727204658.68894: done checking for any_errors_fatal 41175 1727204658.68895: checking for max_fail_percentage 41175 1727204658.68897: done checking for max_fail_percentage 41175 1727204658.68898: checking to see if all hosts have failed and the running result is not ok 41175 1727204658.68899: done checking to see if all hosts have failed 41175 1727204658.68900: getting the remaining hosts for this loop 41175 1727204658.68902: done getting the remaining hosts for this loop 41175 1727204658.68907: getting the next task for host managed-node3 41175 1727204658.68915: done getting next task for host managed-node3 41175 1727204658.68920: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 41175 1727204658.68924: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204658.68950: getting variables 41175 1727204658.68952: in VariableManager get_vars() 41175 1727204658.69122: Calling all_inventory to load vars for managed-node3 41175 1727204658.69126: Calling groups_inventory to load vars for managed-node3 41175 1727204658.69129: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204658.69144: Calling all_plugins_play to load vars for managed-node3 41175 1727204658.69147: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204658.69152: Calling groups_plugins_play to load vars for managed-node3 41175 1727204658.69875: done sending task result for task 12b410aa-8751-f070-39c4-000000000070 41175 1727204658.69878: WORKER PROCESS EXITING 41175 1727204658.71886: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204658.75075: done with get_vars() 41175 1727204658.75133: done getting variables 41175 1727204658.75186: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 15:04:18 -0400 (0:00:00.081) 0:00:25.891 ***** 41175 1727204658.75224: entering _queue_task() for managed-node3/package 41175 1727204658.75510: worker is 1 (out of 1 available) 41175 1727204658.75526: exiting _queue_task() for managed-node3/package 41175 1727204658.75540: done queuing things up, now waiting for results queue to drain 41175 1727204658.75542: waiting for pending results... 
41175 1727204658.75755: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 41175 1727204658.75869: in run() - task 12b410aa-8751-f070-39c4-000000000071 41175 1727204658.75886: variable 'ansible_search_path' from source: unknown 41175 1727204658.75890: variable 'ansible_search_path' from source: unknown 41175 1727204658.75929: calling self._execute() 41175 1727204658.76016: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204658.76025: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204658.76036: variable 'omit' from source: magic vars 41175 1727204658.76372: variable 'ansible_distribution_major_version' from source: facts 41175 1727204658.76383: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204658.76494: variable 'network_state' from source: role '' defaults 41175 1727204658.76503: Evaluated conditional (network_state != {}): False 41175 1727204658.76507: when evaluation is False, skipping this task 41175 1727204658.76510: _execute() done 41175 1727204658.76515: dumping result to json 41175 1727204658.76522: done dumping result, returning 41175 1727204658.76536: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12b410aa-8751-f070-39c4-000000000071] 41175 1727204658.76539: sending task result for task 12b410aa-8751-f070-39c4-000000000071 41175 1727204658.76643: done sending task result for task 12b410aa-8751-f070-39c4-000000000071 41175 1727204658.76646: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 41175 1727204658.76701: no more pending results, returning what we have 41175 1727204658.76705: results queue empty 41175 1727204658.76707: checking for 
any_errors_fatal 41175 1727204658.76717: done checking for any_errors_fatal 41175 1727204658.76718: checking for max_fail_percentage 41175 1727204658.76720: done checking for max_fail_percentage 41175 1727204658.76721: checking to see if all hosts have failed and the running result is not ok 41175 1727204658.76722: done checking to see if all hosts have failed 41175 1727204658.76723: getting the remaining hosts for this loop 41175 1727204658.76725: done getting the remaining hosts for this loop 41175 1727204658.76729: getting the next task for host managed-node3 41175 1727204658.76738: done getting next task for host managed-node3 41175 1727204658.76742: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 41175 1727204658.76745: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204658.76767: getting variables 41175 1727204658.76769: in VariableManager get_vars() 41175 1727204658.76821: Calling all_inventory to load vars for managed-node3 41175 1727204658.76825: Calling groups_inventory to load vars for managed-node3 41175 1727204658.76827: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204658.76838: Calling all_plugins_play to load vars for managed-node3 41175 1727204658.76841: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204658.76845: Calling groups_plugins_play to load vars for managed-node3 41175 1727204658.78659: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204658.80467: done with get_vars() 41175 1727204658.80498: done getting variables 41175 1727204658.80555: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 15:04:18 -0400 (0:00:00.053) 0:00:25.944 ***** 41175 1727204658.80585: entering _queue_task() for managed-node3/service 41175 1727204658.80871: worker is 1 (out of 1 available) 41175 1727204658.80888: exiting _queue_task() for managed-node3/service 41175 1727204658.80903: done queuing things up, now waiting for results queue to drain 41175 1727204658.80905: waiting for pending results... 
41175 1727204658.81118: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 41175 1727204658.81232: in run() - task 12b410aa-8751-f070-39c4-000000000072 41175 1727204658.81248: variable 'ansible_search_path' from source: unknown 41175 1727204658.81252: variable 'ansible_search_path' from source: unknown 41175 1727204658.81286: calling self._execute() 41175 1727204658.81376: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204658.81383: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204658.81395: variable 'omit' from source: magic vars 41175 1727204658.81734: variable 'ansible_distribution_major_version' from source: facts 41175 1727204658.81745: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204658.81853: variable '__network_wireless_connections_defined' from source: role '' defaults 41175 1727204658.82038: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41175 1727204658.84697: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41175 1727204658.84702: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41175 1727204658.84752: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41175 1727204658.84794: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41175 1727204658.84830: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41175 1727204658.84937: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 41175 1727204658.84962: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204658.85043: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204658.85047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204658.85095: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204658.85130: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204658.85163: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204658.85192: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204658.85245: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204658.85263: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204658.85496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204658.85500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204658.85503: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204658.85505: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204658.85508: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204658.85668: variable 'network_connections' from source: task vars 41175 1727204658.85684: variable 'interface' from source: set_fact 41175 1727204658.85777: variable 'interface' from source: set_fact 41175 1727204658.85787: variable 'interface' from source: set_fact 41175 1727204658.85863: variable 'interface' from source: set_fact 41175 1727204658.85963: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41175 1727204658.86181: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41175 1727204658.86228: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41175 1727204658.86265: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41175 1727204658.86308: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41175 1727204658.86364: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41175 1727204658.86396: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41175 1727204658.86424: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204658.86456: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41175 1727204658.86528: variable '__network_team_connections_defined' from source: role '' defaults 41175 1727204658.86857: variable 'network_connections' from source: task vars 41175 1727204658.86863: variable 'interface' from source: set_fact 41175 1727204658.86946: variable 'interface' from source: set_fact 41175 1727204658.86950: variable 'interface' from source: set_fact 41175 1727204658.87030: variable 'interface' from source: set_fact 41175 1727204658.87069: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 41175 1727204658.87073: when evaluation is False, skipping this task 41175 1727204658.87076: _execute() done 41175 1727204658.87080: dumping result to json 41175 1727204658.87085: done dumping result, returning 41175 1727204658.87099: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart 
NetworkManager due to wireless or team interfaces [12b410aa-8751-f070-39c4-000000000072] 41175 1727204658.87109: sending task result for task 12b410aa-8751-f070-39c4-000000000072 41175 1727204658.87235: done sending task result for task 12b410aa-8751-f070-39c4-000000000072 41175 1727204658.87238: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 41175 1727204658.87320: no more pending results, returning what we have 41175 1727204658.87323: results queue empty 41175 1727204658.87325: checking for any_errors_fatal 41175 1727204658.87332: done checking for any_errors_fatal 41175 1727204658.87333: checking for max_fail_percentage 41175 1727204658.87335: done checking for max_fail_percentage 41175 1727204658.87336: checking to see if all hosts have failed and the running result is not ok 41175 1727204658.87337: done checking to see if all hosts have failed 41175 1727204658.87338: getting the remaining hosts for this loop 41175 1727204658.87339: done getting the remaining hosts for this loop 41175 1727204658.87344: getting the next task for host managed-node3 41175 1727204658.87351: done getting next task for host managed-node3 41175 1727204658.87355: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 41175 1727204658.87358: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 41175 1727204658.87379: getting variables 41175 1727204658.87381: in VariableManager get_vars() 41175 1727204658.87426: Calling all_inventory to load vars for managed-node3 41175 1727204658.87430: Calling groups_inventory to load vars for managed-node3 41175 1727204658.87432: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204658.87444: Calling all_plugins_play to load vars for managed-node3 41175 1727204658.87447: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204658.87450: Calling groups_plugins_play to load vars for managed-node3 41175 1727204658.89724: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204658.92903: done with get_vars() 41175 1727204658.92943: done getting variables 41175 1727204658.93021: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 15:04:18 -0400 (0:00:00.124) 0:00:26.069 ***** 41175 1727204658.93063: entering _queue_task() for managed-node3/service 41175 1727204658.93467: worker is 1 (out of 1 available) 41175 1727204658.93483: exiting _queue_task() for managed-node3/service 41175 1727204658.93603: done queuing things up, now waiting for results queue to drain 41175 1727204658.93605: waiting for pending results... 
41175 1727204658.94017: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 41175 1727204658.94023: in run() - task 12b410aa-8751-f070-39c4-000000000073 41175 1727204658.94027: variable 'ansible_search_path' from source: unknown 41175 1727204658.94031: variable 'ansible_search_path' from source: unknown 41175 1727204658.94073: calling self._execute() 41175 1727204658.94181: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204658.94190: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204658.94207: variable 'omit' from source: magic vars 41175 1727204658.94678: variable 'ansible_distribution_major_version' from source: facts 41175 1727204658.94693: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204658.94927: variable 'network_provider' from source: set_fact 41175 1727204658.94933: variable 'network_state' from source: role '' defaults 41175 1727204658.94946: Evaluated conditional (network_provider == "nm" or network_state != {}): True 41175 1727204658.94982: variable 'omit' from source: magic vars 41175 1727204658.95036: variable 'omit' from source: magic vars 41175 1727204658.95070: variable 'network_service_name' from source: role '' defaults 41175 1727204658.95200: variable 'network_service_name' from source: role '' defaults 41175 1727204658.95316: variable '__network_provider_setup' from source: role '' defaults 41175 1727204658.95326: variable '__network_service_name_default_nm' from source: role '' defaults 41175 1727204658.95418: variable '__network_service_name_default_nm' from source: role '' defaults 41175 1727204658.95422: variable '__network_packages_default_nm' from source: role '' defaults 41175 1727204658.95529: variable '__network_packages_default_nm' from source: role '' defaults 41175 1727204658.95821: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 41175 1727204658.98504: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41175 1727204658.98508: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41175 1727204658.98553: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41175 1727204658.98602: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41175 1727204658.98634: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41175 1727204658.98739: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204658.98774: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204658.98834: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204658.98866: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204658.98942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204658.98949: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 41175 1727204658.98980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204658.99017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204658.99068: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204658.99086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204658.99410: variable '__network_packages_default_gobject_packages' from source: role '' defaults 41175 1727204658.99578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204658.99710: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204658.99714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204658.99717: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204658.99720: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204658.99835: variable 'ansible_python' from source: facts 41175 1727204658.99864: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 41175 1727204658.99972: variable '__network_wpa_supplicant_required' from source: role '' defaults 41175 1727204659.00073: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 41175 1727204659.00243: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204659.00271: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204659.00303: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204659.00360: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204659.00377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204659.00441: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204659.00469: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204659.00500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204659.00557: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204659.00579: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204659.00765: variable 'network_connections' from source: task vars 41175 1727204659.00774: variable 'interface' from source: set_fact 41175 1727204659.00907: variable 'interface' from source: set_fact 41175 1727204659.00915: variable 'interface' from source: set_fact 41175 1727204659.00985: variable 'interface' from source: set_fact 41175 1727204659.01169: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41175 1727204659.01447: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41175 1727204659.01694: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41175 1727204659.01698: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41175 1727204659.01701: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41175 1727204659.01703: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41175 1727204659.01718: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41175 1727204659.01764: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204659.01806: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41175 1727204659.01863: variable '__network_wireless_connections_defined' from source: role '' defaults 41175 1727204659.02261: variable 'network_connections' from source: task vars 41175 1727204659.02265: variable 'interface' from source: set_fact 41175 1727204659.02357: variable 'interface' from source: set_fact 41175 1727204659.02371: variable 'interface' from source: set_fact 41175 1727204659.02463: variable 'interface' from source: set_fact 41175 1727204659.02563: variable '__network_packages_default_wireless' from source: role '' defaults 41175 1727204659.02666: variable '__network_wireless_connections_defined' from source: role '' defaults 41175 1727204659.03069: variable 'network_connections' from source: task vars 41175 1727204659.03075: variable 'interface' from source: set_fact 41175 1727204659.03169: variable 'interface' from source: set_fact 41175 1727204659.03176: variable 'interface' from source: set_fact 41175 1727204659.03265: variable 'interface' from source: set_fact 41175 1727204659.03320: variable '__network_packages_default_team' from source: role '' defaults 41175 1727204659.03414: variable '__network_team_connections_defined' from source: role '' defaults 41175 1727204659.03888: variable 
'network_connections' from source: task vars 41175 1727204659.03895: variable 'interface' from source: set_fact 41175 1727204659.03926: variable 'interface' from source: set_fact 41175 1727204659.03998: variable 'interface' from source: set_fact 41175 1727204659.04023: variable 'interface' from source: set_fact 41175 1727204659.04119: variable '__network_service_name_default_initscripts' from source: role '' defaults 41175 1727204659.04200: variable '__network_service_name_default_initscripts' from source: role '' defaults 41175 1727204659.04210: variable '__network_packages_default_initscripts' from source: role '' defaults 41175 1727204659.04284: variable '__network_packages_default_initscripts' from source: role '' defaults 41175 1727204659.04603: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 41175 1727204659.05060: variable 'network_connections' from source: task vars 41175 1727204659.05065: variable 'interface' from source: set_fact 41175 1727204659.05125: variable 'interface' from source: set_fact 41175 1727204659.05131: variable 'interface' from source: set_fact 41175 1727204659.05181: variable 'interface' from source: set_fact 41175 1727204659.05198: variable 'ansible_distribution' from source: facts 41175 1727204659.05203: variable '__network_rh_distros' from source: role '' defaults 41175 1727204659.05209: variable 'ansible_distribution_major_version' from source: facts 41175 1727204659.05234: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 41175 1727204659.05382: variable 'ansible_distribution' from source: facts 41175 1727204659.05386: variable '__network_rh_distros' from source: role '' defaults 41175 1727204659.05394: variable 'ansible_distribution_major_version' from source: facts 41175 1727204659.05401: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 41175 1727204659.05554: variable 'ansible_distribution' from source: 
facts 41175 1727204659.05558: variable '__network_rh_distros' from source: role '' defaults 41175 1727204659.05564: variable 'ansible_distribution_major_version' from source: facts 41175 1727204659.05596: variable 'network_provider' from source: set_fact 41175 1727204659.05619: variable 'omit' from source: magic vars 41175 1727204659.05651: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41175 1727204659.05676: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41175 1727204659.05696: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41175 1727204659.05712: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204659.05726: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204659.05756: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41175 1727204659.05761: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204659.05764: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204659.05857: Set connection var ansible_shell_executable to /bin/sh 41175 1727204659.05860: Set connection var ansible_shell_type to sh 41175 1727204659.05866: Set connection var ansible_pipelining to False 41175 1727204659.05878: Set connection var ansible_timeout to 10 41175 1727204659.05881: Set connection var ansible_connection to ssh 41175 1727204659.05888: Set connection var ansible_module_compression to ZIP_DEFLATED 41175 1727204659.05912: variable 'ansible_shell_executable' from source: unknown 41175 1727204659.05915: variable 'ansible_connection' from source: unknown 41175 1727204659.05922: variable 'ansible_module_compression' from source: unknown 41175 1727204659.05924: 
variable 'ansible_shell_type' from source: unknown 41175 1727204659.05929: variable 'ansible_shell_executable' from source: unknown 41175 1727204659.05934: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204659.05942: variable 'ansible_pipelining' from source: unknown 41175 1727204659.05945: variable 'ansible_timeout' from source: unknown 41175 1727204659.05949: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204659.06045: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41175 1727204659.06054: variable 'omit' from source: magic vars 41175 1727204659.06066: starting attempt loop 41175 1727204659.06069: running the handler 41175 1727204659.06138: variable 'ansible_facts' from source: unknown 41175 1727204659.06875: _low_level_execute_command(): starting 41175 1727204659.06883: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41175 1727204659.07645: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204659.07670: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204659.07740: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204659.09484: stdout chunk (state=3): >>>/root <<< 41175 1727204659.09611: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204659.09657: stderr chunk (state=3): >>><<< 41175 1727204659.09660: stdout chunk (state=3): >>><<< 41175 1727204659.09673: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204659.09686: _low_level_execute_command(): starting 41175 
1727204659.09710: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204659.0967762-42377-56613462612366 `" && echo ansible-tmp-1727204659.0967762-42377-56613462612366="` echo /root/.ansible/tmp/ansible-tmp-1727204659.0967762-42377-56613462612366 `" ) && sleep 0' 41175 1727204659.10159: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204659.10162: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 41175 1727204659.10165: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204659.10169: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204659.10172: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204659.10223: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204659.10227: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204659.10269: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204659.12275: stdout chunk (state=3): 
>>>ansible-tmp-1727204659.0967762-42377-56613462612366=/root/.ansible/tmp/ansible-tmp-1727204659.0967762-42377-56613462612366 <<< 41175 1727204659.12510: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204659.12514: stdout chunk (state=3): >>><<< 41175 1727204659.12518: stderr chunk (state=3): >>><<< 41175 1727204659.12552: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204659.0967762-42377-56613462612366=/root/.ansible/tmp/ansible-tmp-1727204659.0967762-42377-56613462612366 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204659.12579: variable 'ansible_module_compression' from source: unknown 41175 1727204659.12622: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-41175qa2aqh90/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 41175 1727204659.12677: variable 'ansible_facts' from source: unknown 41175 
1727204659.12822: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204659.0967762-42377-56613462612366/AnsiballZ_systemd.py 41175 1727204659.12943: Sending initial data 41175 1727204659.12947: Sent initial data (155 bytes) 41175 1727204659.13608: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204659.13663: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204659.13683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204659.13687: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204659.13787: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204659.13819: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204659.13904: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204659.15743: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 
debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41175 1727204659.15749: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 41175 1727204659.15832: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204659.0967762-42377-56613462612366/AnsiballZ_systemd.py" debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-41175qa2aqh90/tmpe8vo3ufv" to remote "/root/.ansible/tmp/ansible-tmp-1727204659.0967762-42377-56613462612366/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204659.0967762-42377-56613462612366/AnsiballZ_systemd.py" <<< 41175 1727204659.15841: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41175qa2aqh90/tmpe8vo3ufv /root/.ansible/tmp/ansible-tmp-1727204659.0967762-42377-56613462612366/AnsiballZ_systemd.py <<< 41175 1727204659.18605: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204659.18717: stderr chunk (state=3): >>><<< 41175 1727204659.18748: stdout chunk (state=3): >>><<< 41175 1727204659.18793: done transferring module to remote 41175 1727204659.18805: _low_level_execute_command(): starting 41175 1727204659.18819: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204659.0967762-42377-56613462612366/ /root/.ansible/tmp/ansible-tmp-1727204659.0967762-42377-56613462612366/AnsiballZ_systemd.py && sleep 0' 41175 
1727204659.19523: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204659.19528: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204659.19533: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 41175 1727204659.19535: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204659.19537: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204659.19632: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204659.19652: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204659.19662: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204659.19736: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204659.21602: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204659.21660: stderr chunk (state=3): >>><<< 41175 1727204659.21664: stdout chunk (state=3): >>><<< 41175 1727204659.21707: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204659.21710: _low_level_execute_command(): starting 41175 1727204659.21714: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204659.0967762-42377-56613462612366/AnsiballZ_systemd.py && sleep 0' 41175 1727204659.22361: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204659.22365: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204659.22448: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204659.22452: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204659.22492: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204659.55337: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "647", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:45:24 EDT", "ExecMainStartTimestampMonotonic": "28911103", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "647", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager 
--no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3378", "MemoryCurrent": "11894784", "MemoryAvailable": "infinity", "CPUUsageNSec": "1910145000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", 
"StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "in<<< 41175 1727204659.55368: stdout chunk (state=3): >>>finity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", 
"SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target cloud-init.service shutdown.target NetworkManager-wait-online.service network.service network.target", "After": "network-pre.target basic.target dbus-broker.service cloud-init-local.service systemd-journald.socket sysinit.target dbus.socket system.slice", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 15:02:42 EDT", "StateChangeTimestampMonotonic": "1066831351", "InactiveExitTimestamp": "Tue 2024-09-24 14:45:24 EDT", "InactiveExitTimestampMonotonic": "28911342", "ActiveEnterTimestamp": "Tue 2024-09-24 14:45:25 EDT", "ActiveEnterTimestampMonotonic": "29816317", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": 
"no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:45:24 EDT", "ConditionTimestampMonotonic": "28901880", "AssertTimestamp": "Tue 2024-09-24 14:45:24 EDT", "AssertTimestampMonotonic": "28901883", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "b6a383b318af414f897f5e2227729b18", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 41175 1727204659.57514: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204659.57627: stderr chunk (state=3): >>>Shared connection to 10.31.10.90 closed. 
<<< 41175 1727204659.57651: stderr chunk (state=3): >>><<< 41175 1727204659.57671: stdout chunk (state=3): >>><<< 41175 1727204659.57705: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "647", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:45:24 EDT", "ExecMainStartTimestampMonotonic": "28911103", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "647", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; 
stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3378", "MemoryCurrent": "11894784", "MemoryAvailable": "infinity", "CPUUsageNSec": "1910145000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", 
"MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", 
"PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target 
cloud-init.service shutdown.target NetworkManager-wait-online.service network.service network.target", "After": "network-pre.target basic.target dbus-broker.service cloud-init-local.service systemd-journald.socket sysinit.target dbus.socket system.slice", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 15:02:42 EDT", "StateChangeTimestampMonotonic": "1066831351", "InactiveExitTimestamp": "Tue 2024-09-24 14:45:24 EDT", "InactiveExitTimestampMonotonic": "28911342", "ActiveEnterTimestamp": "Tue 2024-09-24 14:45:25 EDT", "ActiveEnterTimestampMonotonic": "29816317", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:45:24 EDT", "ConditionTimestampMonotonic": "28901880", "AssertTimestamp": "Tue 2024-09-24 14:45:24 EDT", "AssertTimestampMonotonic": "28901883", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "b6a383b318af414f897f5e2227729b18", "CollectMode": 
"inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
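Editorial aside for readers tracing this log: the JSON payload that `AnsiballZ_systemd.py` printed on stdout above is what the `systemd` module hands back to the controller. It can be checked programmatically; the sketch below uses a dict abridged from the log (only a few of the many `status` keys are reproduced), and `service_is_healthy` is a hypothetical helper, not part of Ansible.

```python
import json

# Abridged excerpt of the systemd module result seen in the log above;
# the real payload carries hundreds of unit properties.
module_result = json.loads("""
{
  "name": "NetworkManager",
  "changed": false,
  "enabled": true,
  "state": "started",
  "status": {
    "ActiveState": "active",
    "SubState": "running",
    "UnitFileState": "enabled",
    "MainPID": "647"
  }
}
""")

def service_is_healthy(result):
    """Return True when the module reports the unit active and enabled."""
    status = result["status"]
    return (
        status.get("ActiveState") == "active"
        and status.get("UnitFileState") == "enabled"
        and result.get("enabled") is True
    )

print(service_is_healthy(module_result))  # → True
```

Note that `"changed": false` in the payload is why the task below reports `ok:` rather than `changed:` — NetworkManager was already running and enabled.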
41175 1727204659.58209: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204659.0967762-42377-56613462612366/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41175 1727204659.58213: _low_level_execute_command(): starting 41175 1727204659.58216: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204659.0967762-42377-56613462612366/ > /dev/null 2>&1 && sleep 0' 41175 1727204659.58964: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 1727204659.58981: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204659.59080: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204659.59129: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204659.59146: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204659.59247: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204659.59540: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204659.61559: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204659.61572: stdout chunk (state=3): >>><<< 41175 1727204659.61586: stderr chunk (state=3): >>><<< 41175 1727204659.61611: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204659.61627: 
handler run complete 41175 1727204659.61715: attempt loop complete, returning result 41175 1727204659.61723: _execute() done 41175 1727204659.61729: dumping result to json 41175 1727204659.61753: done dumping result, returning 41175 1727204659.61995: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12b410aa-8751-f070-39c4-000000000073] 41175 1727204659.61999: sending task result for task 12b410aa-8751-f070-39c4-000000000073 41175 1727204659.62166: done sending task result for task 12b410aa-8751-f070-39c4-000000000073 41175 1727204659.62170: WORKER PROCESS EXITING ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 41175 1727204659.62243: no more pending results, returning what we have 41175 1727204659.62247: results queue empty 41175 1727204659.62248: checking for any_errors_fatal 41175 1727204659.62256: done checking for any_errors_fatal 41175 1727204659.62257: checking for max_fail_percentage 41175 1727204659.62259: done checking for max_fail_percentage 41175 1727204659.62260: checking to see if all hosts have failed and the running result is not ok 41175 1727204659.62261: done checking to see if all hosts have failed 41175 1727204659.62261: getting the remaining hosts for this loop 41175 1727204659.62263: done getting the remaining hosts for this loop 41175 1727204659.62268: getting the next task for host managed-node3 41175 1727204659.62275: done getting next task for host managed-node3 41175 1727204659.62279: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 41175 1727204659.62282: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41175 1727204659.62297: getting variables 41175 1727204659.62299: in VariableManager get_vars() 41175 1727204659.62351: Calling all_inventory to load vars for managed-node3 41175 1727204659.62355: Calling groups_inventory to load vars for managed-node3 41175 1727204659.62357: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204659.62369: Calling all_plugins_play to load vars for managed-node3 41175 1727204659.62372: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204659.62376: Calling groups_plugins_play to load vars for managed-node3 41175 1727204659.64871: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204659.68406: done with get_vars() 41175 1727204659.68464: done getting variables 41175 1727204659.68546: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 15:04:19 -0400 (0:00:00.755) 0:00:26.824 ***** 41175 1727204659.68586: entering _queue_task() for managed-node3/service 41175 1727204659.69042: worker is 1 (out of 1 available) 41175 1727204659.69058: exiting _queue_task() for managed-node3/service 
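Editorial aside: the `"censored"` result printed a few lines above is Ansible's standard redaction when a task sets `no_log: true` (visible in the log as `'_ansible_no_log': True`). The role's actual task file is not reproduced here; the fragment below is an illustrative sketch assembled from the module arguments visible in the log.

```yaml
- name: Enable and start NetworkManager
  ansible.builtin.systemd:
    name: NetworkManager
    state: started
    enabled: true
  no_log: true   # suppresses the full module result, producing the "censored" output
```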
41175 1727204659.69074: done queuing things up, now waiting for results queue to drain 41175 1727204659.69076: waiting for pending results... 41175 1727204659.69382: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 41175 1727204659.69562: in run() - task 12b410aa-8751-f070-39c4-000000000074 41175 1727204659.69578: variable 'ansible_search_path' from source: unknown 41175 1727204659.69584: variable 'ansible_search_path' from source: unknown 41175 1727204659.69795: calling self._execute() 41175 1727204659.69800: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204659.69803: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204659.69806: variable 'omit' from source: magic vars 41175 1727204659.70305: variable 'ansible_distribution_major_version' from source: facts 41175 1727204659.70318: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204659.70494: variable 'network_provider' from source: set_fact 41175 1727204659.70507: Evaluated conditional (network_provider == "nm"): True 41175 1727204659.70642: variable '__network_wpa_supplicant_required' from source: role '' defaults 41175 1727204659.70765: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 41175 1727204659.71011: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41175 1727204659.74444: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41175 1727204659.74540: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41175 1727204659.74585: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41175 1727204659.74643: Loading FilterModule 'urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41175 1727204659.74676: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41175 1727204659.74795: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204659.74841: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204659.74879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204659.74936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204659.74957: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204659.75021: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204659.75193: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204659.75197: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, 
class_only=False) 41175 1727204659.75200: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204659.75234: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204659.75288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204659.75331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204659.75361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204659.75413: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204659.75445: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204659.75663: variable 'network_connections' from source: task vars 41175 1727204659.75679: variable 'interface' from source: set_fact 41175 1727204659.75826: variable 'interface' from source: set_fact 41175 1727204659.75837: variable 'interface' from source: set_fact 41175 1727204659.76023: variable 'interface' from source: set_fact 41175 1727204659.76195: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41175 1727204659.76394: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41175 1727204659.76451: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41175 1727204659.76488: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41175 1727204659.76538: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41175 1727204659.76591: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41175 1727204659.76630: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41175 1727204659.76665: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204659.76698: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41175 1727204659.76894: variable '__network_wireless_connections_defined' from source: role '' defaults 41175 1727204659.77149: variable 'network_connections' from source: task vars 41175 1727204659.77165: variable 'interface' from source: set_fact 41175 1727204659.77244: variable 'interface' from source: set_fact 41175 1727204659.77251: variable 'interface' from source: set_fact 41175 1727204659.77334: variable 'interface' from source: set_fact 41175 1727204659.77407: Evaluated conditional 
(__network_wpa_supplicant_required): False 41175 1727204659.77411: when evaluation is False, skipping this task 41175 1727204659.77414: _execute() done 41175 1727204659.77425: dumping result to json 41175 1727204659.77428: done dumping result, returning 41175 1727204659.77627: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12b410aa-8751-f070-39c4-000000000074] 41175 1727204659.77630: sending task result for task 12b410aa-8751-f070-39c4-000000000074 41175 1727204659.77704: done sending task result for task 12b410aa-8751-f070-39c4-000000000074 41175 1727204659.77708: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 41175 1727204659.77773: no more pending results, returning what we have 41175 1727204659.77778: results queue empty 41175 1727204659.77780: checking for any_errors_fatal 41175 1727204659.77911: done checking for any_errors_fatal 41175 1727204659.77912: checking for max_fail_percentage 41175 1727204659.77915: done checking for max_fail_percentage 41175 1727204659.77916: checking to see if all hosts have failed and the running result is not ok 41175 1727204659.77920: done checking to see if all hosts have failed 41175 1727204659.77921: getting the remaining hosts for this loop 41175 1727204659.77923: done getting the remaining hosts for this loop 41175 1727204659.77927: getting the next task for host managed-node3 41175 1727204659.77935: done getting next task for host managed-node3 41175 1727204659.77940: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 41175 1727204659.77943: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41175 1727204659.77965: getting variables 41175 1727204659.77967: in VariableManager get_vars() 41175 1727204659.78167: Calling all_inventory to load vars for managed-node3 41175 1727204659.78170: Calling groups_inventory to load vars for managed-node3 41175 1727204659.78173: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204659.78191: Calling all_plugins_play to load vars for managed-node3 41175 1727204659.78195: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204659.78201: Calling groups_plugins_play to load vars for managed-node3 41175 1727204659.80729: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204659.83655: done with get_vars() 41175 1727204659.83709: done getting variables 41175 1727204659.83786: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 15:04:19 -0400 (0:00:00.152) 0:00:26.977 ***** 41175 1727204659.83828: entering _queue_task() for managed-node3/service 41175 1727204659.84215: worker is 1 (out of 1 available) 41175 1727204659.84231: exiting _queue_task() for managed-node3/service 
41175 1727204659.84245: done queuing things up, now waiting for results queue to drain 41175 1727204659.84247: waiting for pending results... 41175 1727204659.84572: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service 41175 1727204659.84756: in run() - task 12b410aa-8751-f070-39c4-000000000075 41175 1727204659.84779: variable 'ansible_search_path' from source: unknown 41175 1727204659.84787: variable 'ansible_search_path' from source: unknown 41175 1727204659.84837: calling self._execute() 41175 1727204659.84949: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204659.84963: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204659.84982: variable 'omit' from source: magic vars 41175 1727204659.85428: variable 'ansible_distribution_major_version' from source: facts 41175 1727204659.85447: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204659.85611: variable 'network_provider' from source: set_fact 41175 1727204659.85624: Evaluated conditional (network_provider == "initscripts"): False 41175 1727204659.85633: when evaluation is False, skipping this task 41175 1727204659.85640: _execute() done 41175 1727204659.85795: dumping result to json 41175 1727204659.85798: done dumping result, returning 41175 1727204659.85801: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service [12b410aa-8751-f070-39c4-000000000075] 41175 1727204659.85804: sending task result for task 12b410aa-8751-f070-39c4-000000000075 41175 1727204659.85877: done sending task result for task 12b410aa-8751-f070-39c4-000000000075 41175 1727204659.85881: WORKER PROCESS EXITING skipping: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 41175 1727204659.85934: no more pending results, returning what we have 
41175 1727204659.85940: results queue empty 41175 1727204659.85941: checking for any_errors_fatal 41175 1727204659.85957: done checking for any_errors_fatal 41175 1727204659.85958: checking for max_fail_percentage 41175 1727204659.85960: done checking for max_fail_percentage 41175 1727204659.85962: checking to see if all hosts have failed and the running result is not ok 41175 1727204659.85963: done checking to see if all hosts have failed 41175 1727204659.85964: getting the remaining hosts for this loop 41175 1727204659.85967: done getting the remaining hosts for this loop 41175 1727204659.85972: getting the next task for host managed-node3 41175 1727204659.85981: done getting next task for host managed-node3 41175 1727204659.85986: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 41175 1727204659.85993: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204659.86017: getting variables 41175 1727204659.86020: in VariableManager get_vars() 41175 1727204659.86066: Calling all_inventory to load vars for managed-node3 41175 1727204659.86069: Calling groups_inventory to load vars for managed-node3 41175 1727204659.86072: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204659.86088: Calling all_plugins_play to load vars for managed-node3 41175 1727204659.86297: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204659.86303: Calling groups_plugins_play to load vars for managed-node3 41175 1727204659.88580: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204659.91500: done with get_vars() 41175 1727204659.91546: done getting variables 41175 1727204659.91624: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 15:04:19 -0400 (0:00:00.078) 0:00:27.055 ***** 41175 1727204659.91668: entering _queue_task() for managed-node3/copy 41175 1727204659.92047: worker is 1 (out of 1 available) 41175 1727204659.92062: exiting _queue_task() for managed-node3/copy 41175 1727204659.92076: done queuing things up, now waiting for results queue to drain 41175 1727204659.92078: waiting for pending results... 
41175 1727204659.92513: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 41175 1727204659.92588: in run() - task 12b410aa-8751-f070-39c4-000000000076 41175 1727204659.92618: variable 'ansible_search_path' from source: unknown 41175 1727204659.92628: variable 'ansible_search_path' from source: unknown 41175 1727204659.92669: calling self._execute() 41175 1727204659.92777: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204659.92794: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204659.92811: variable 'omit' from source: magic vars 41175 1727204659.93264: variable 'ansible_distribution_major_version' from source: facts 41175 1727204659.93284: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204659.93432: variable 'network_provider' from source: set_fact 41175 1727204659.93446: Evaluated conditional (network_provider == "initscripts"): False 41175 1727204659.93455: when evaluation is False, skipping this task 41175 1727204659.93473: _execute() done 41175 1727204659.93476: dumping result to json 41175 1727204659.93695: done dumping result, returning 41175 1727204659.93700: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12b410aa-8751-f070-39c4-000000000076] 41175 1727204659.93703: sending task result for task 12b410aa-8751-f070-39c4-000000000076 41175 1727204659.93782: done sending task result for task 12b410aa-8751-f070-39c4-000000000076 41175 1727204659.93786: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 41175 1727204659.93842: no more pending results, returning what we have 41175 1727204659.93848: results queue empty 41175 1727204659.93850: checking for 
any_errors_fatal 41175 1727204659.93856: done checking for any_errors_fatal 41175 1727204659.93857: checking for max_fail_percentage 41175 1727204659.93859: done checking for max_fail_percentage 41175 1727204659.93860: checking to see if all hosts have failed and the running result is not ok 41175 1727204659.93861: done checking to see if all hosts have failed 41175 1727204659.93862: getting the remaining hosts for this loop 41175 1727204659.93864: done getting the remaining hosts for this loop 41175 1727204659.93870: getting the next task for host managed-node3 41175 1727204659.93878: done getting next task for host managed-node3 41175 1727204659.93884: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 41175 1727204659.93890: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204659.93915: getting variables 41175 1727204659.93918: in VariableManager get_vars() 41175 1727204659.93969: Calling all_inventory to load vars for managed-node3 41175 1727204659.93973: Calling groups_inventory to load vars for managed-node3 41175 1727204659.93976: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204659.94173: Calling all_plugins_play to load vars for managed-node3 41175 1727204659.94179: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204659.94184: Calling groups_plugins_play to load vars for managed-node3 41175 1727204659.96307: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204659.98055: done with get_vars() 41175 1727204659.98085: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 15:04:19 -0400 (0:00:00.065) 0:00:27.120 ***** 41175 1727204659.98169: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 41175 1727204659.98456: worker is 1 (out of 1 available) 41175 1727204659.98471: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 41175 1727204659.98484: done queuing things up, now waiting for results queue to drain 41175 1727204659.98486: waiting for pending results... 
41175 1727204659.98700: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 41175 1727204659.98821: in run() - task 12b410aa-8751-f070-39c4-000000000077 41175 1727204659.98838: variable 'ansible_search_path' from source: unknown 41175 1727204659.98842: variable 'ansible_search_path' from source: unknown 41175 1727204659.98876: calling self._execute() 41175 1727204659.98967: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204659.98975: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204659.98987: variable 'omit' from source: magic vars 41175 1727204659.99496: variable 'ansible_distribution_major_version' from source: facts 41175 1727204659.99500: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204659.99504: variable 'omit' from source: magic vars 41175 1727204659.99528: variable 'omit' from source: magic vars 41175 1727204659.99738: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41175 1727204660.02052: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41175 1727204660.02110: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41175 1727204660.02148: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41175 1727204660.02180: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41175 1727204660.02205: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41175 1727204660.02279: variable 'network_provider' from source: set_fact 41175 1727204660.02403: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204660.02430: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204660.02451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204660.02490: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204660.02505: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204660.02567: variable 'omit' from source: magic vars 41175 1727204660.02667: variable 'omit' from source: magic vars 41175 1727204660.02760: variable 'network_connections' from source: task vars 41175 1727204660.02772: variable 'interface' from source: set_fact 41175 1727204660.02836: variable 'interface' from source: set_fact 41175 1727204660.02845: variable 'interface' from source: set_fact 41175 1727204660.02898: variable 'interface' from source: set_fact 41175 1727204660.03088: variable 'omit' from source: magic vars 41175 1727204660.03098: variable '__lsr_ansible_managed' from source: task vars 41175 1727204660.03157: variable '__lsr_ansible_managed' from source: task vars 41175 1727204660.03387: Loaded config def from plugin (lookup/template) 41175 1727204660.03392: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 41175 1727204660.03447: File lookup term: get_ansible_managed.j2 41175 
1727204660.03451: variable 'ansible_search_path' from source: unknown 41175 1727204660.03455: evaluation_path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 41175 1727204660.03461: search_path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 41175 1727204660.03478: variable 'ansible_search_path' from source: unknown 41175 1727204660.19495: variable 'ansible_managed' from source: unknown 41175 1727204660.19698: variable 'omit' from source: magic vars 41175 1727204660.19704: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41175 1727204660.19707: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41175 1727204660.19710: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41175 1727204660.19795: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204660.19799: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204660.19804: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41175 1727204660.19807: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204660.19809: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204660.19880: Set connection var ansible_shell_executable to /bin/sh 41175 1727204660.19884: Set connection var ansible_shell_type to sh 41175 1727204660.19894: Set connection var ansible_pipelining to False 41175 1727204660.19924: Set connection var ansible_timeout to 10 41175 1727204660.19928: Set connection var ansible_connection to ssh 41175 1727204660.19939: Set connection var ansible_module_compression to ZIP_DEFLATED 41175 1727204660.19952: variable 'ansible_shell_executable' from source: unknown 41175 1727204660.19955: variable 'ansible_connection' from source: unknown 41175 1727204660.19960: variable 'ansible_module_compression' from source: unknown 41175 1727204660.19963: variable 'ansible_shell_type' from source: unknown 41175 1727204660.19968: variable 'ansible_shell_executable' from source: unknown 41175 1727204660.19971: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204660.19978: variable 'ansible_pipelining' from source: unknown 41175 1727204660.19980: variable 'ansible_timeout' from source: unknown 41175 1727204660.20252: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204660.20258: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 41175 1727204660.20269: variable 'omit' from source: magic vars 41175 1727204660.20271: starting attempt loop 41175 1727204660.20274: running the handler 41175 1727204660.20276: _low_level_execute_command(): starting 41175 1727204660.20278: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41175 1727204660.20876: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 1727204660.20906: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204660.20910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204660.20966: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204660.20970: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 41175 1727204660.20972: stderr chunk (state=3): >>>debug2: match not found <<< 41175 1727204660.20975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204660.20978: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41175 1727204660.20980: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address <<< 41175 1727204660.20982: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41175 1727204660.21097: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204660.21232: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204660.21270: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204660.23051: stdout chunk (state=3): >>>/root <<< 41175 1727204660.23306: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204660.23313: stdout chunk (state=3): >>><<< 41175 1727204660.23329: stderr chunk (state=3): >>><<< 41175 1727204660.23425: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204660.23439: _low_level_execute_command(): starting 41175 1727204660.23449: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1727204660.2342596-42411-72949181721194 `" && echo ansible-tmp-1727204660.2342596-42411-72949181721194="` echo /root/.ansible/tmp/ansible-tmp-1727204660.2342596-42411-72949181721194 `" ) && sleep 0' 41175 1727204660.24613: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 1727204660.24661: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204660.24665: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204660.24668: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204660.24671: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 41175 1727204660.24674: stderr chunk (state=3): >>>debug2: match not found <<< 41175 1727204660.24676: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204660.24699: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41175 1727204660.24821: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204660.24834: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204660.24867: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204660.26929: stdout chunk (state=3): 
>>>ansible-tmp-1727204660.2342596-42411-72949181721194=/root/.ansible/tmp/ansible-tmp-1727204660.2342596-42411-72949181721194 <<< 41175 1727204660.27132: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204660.27136: stdout chunk (state=3): >>><<< 41175 1727204660.27138: stderr chunk (state=3): >>><<< 41175 1727204660.27296: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204660.2342596-42411-72949181721194=/root/.ansible/tmp/ansible-tmp-1727204660.2342596-42411-72949181721194 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204660.27300: variable 'ansible_module_compression' from source: unknown 41175 1727204660.27302: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-41175qa2aqh90/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 41175 1727204660.27337: 
variable 'ansible_facts' from source: unknown 41175 1727204660.27499: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204660.2342596-42411-72949181721194/AnsiballZ_network_connections.py 41175 1727204660.27777: Sending initial data 41175 1727204660.27781: Sent initial data (167 bytes) 41175 1727204660.28460: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 41175 1727204660.28505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204660.28574: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204660.28601: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204660.28624: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204660.28698: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204660.30358: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" 
revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41175 1727204660.30433: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 41175 1727204660.30457: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41175qa2aqh90/tmphnav2_bl /root/.ansible/tmp/ansible-tmp-1727204660.2342596-42411-72949181721194/AnsiballZ_network_connections.py <<< 41175 1727204660.30461: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204660.2342596-42411-72949181721194/AnsiballZ_network_connections.py" <<< 41175 1727204660.30504: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-41175qa2aqh90/tmphnav2_bl" to remote "/root/.ansible/tmp/ansible-tmp-1727204660.2342596-42411-72949181721194/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204660.2342596-42411-72949181721194/AnsiballZ_network_connections.py" <<< 41175 1727204660.31802: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204660.31872: stderr chunk (state=3): >>><<< 41175 1727204660.31875: stdout chunk (state=3): >>><<< 41175 1727204660.31904: done transferring module to remote 41175 1727204660.31916: _low_level_execute_command(): starting 41175 1727204660.31924: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204660.2342596-42411-72949181721194/ 
/root/.ansible/tmp/ansible-tmp-1727204660.2342596-42411-72949181721194/AnsiballZ_network_connections.py && sleep 0' 41175 1727204660.32362: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204660.32372: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204660.32396: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204660.32399: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204660.32411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204660.32457: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204660.32477: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204660.32514: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204660.34432: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204660.34466: stderr chunk (state=3): >>><<< 41175 1727204660.34469: stdout chunk (state=3): >>><<< 41175 1727204660.34486: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204660.34491: _low_level_execute_command(): starting 41175 1727204660.34498: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204660.2342596-42411-72949181721194/AnsiballZ_network_connections.py && sleep 0' 41175 1727204660.35182: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204660.35187: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204660.35265: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204660.35321: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204660.35365: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204660.66701: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'ethtest0': update connection ethtest0, 270b38fe-b70b-4444-a0b4-74394ad4b2b5\n[004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 270b38fe-b70b-4444-a0b4-74394ad4b2b5 (is-modified)\n[005] #0, state:up persistent_state:present, 'ethtest0': connection reapplied\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"dhcp4": false, "address": ["198.51.100.3/26"], "route": [{"network": "198.51.100.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2, "table": "custom"}, {"network": "198.51.100.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4, "table": "custom"}, {"network": "192.0.2.64", "prefix": 26, "gateway": "198.51.100.8", "metric": 50, "table": "custom", "src": "198.51.100.3"}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", 
"type": "ethernet", "autoconnect": true, "ip": {"dhcp4": false, "address": ["198.51.100.3/26"], "route": [{"network": "198.51.100.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2, "table": "custom"}, {"network": "198.51.100.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4, "table": "custom"}, {"network": "192.0.2.64", "prefix": 26, "gateway": "198.51.100.8", "metric": 50, "table": "custom", "src": "198.51.100.3"}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 41175 1727204660.68847: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 41175 1727204660.68910: stderr chunk (state=3): >>><<< 41175 1727204660.68915: stdout chunk (state=3): >>><<< 41175 1727204660.68939: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'ethtest0': update connection ethtest0, 270b38fe-b70b-4444-a0b4-74394ad4b2b5\n[004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 270b38fe-b70b-4444-a0b4-74394ad4b2b5 (is-modified)\n[005] #0, state:up persistent_state:present, 'ethtest0': connection reapplied\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"dhcp4": false, "address": ["198.51.100.3/26"], "route": [{"network": "198.51.100.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2, "table": "custom"}, {"network": "198.51.100.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4, "table": "custom"}, {"network": "192.0.2.64", "prefix": 26, "gateway": "198.51.100.8", "metric": 50, "table": "custom", "src": "198.51.100.3"}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": 
""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"dhcp4": false, "address": ["198.51.100.3/26"], "route": [{"network": "198.51.100.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2, "table": "custom"}, {"network": "198.51.100.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4, "table": "custom"}, {"network": "192.0.2.64", "prefix": 26, "gateway": "198.51.100.8", "metric": 50, "table": "custom", "src": "198.51.100.3"}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
41175 1727204660.69007: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'ethtest0', 'interface_name': 'ethtest0', 'state': 'up', 'type': 'ethernet', 'autoconnect': True, 'ip': {'dhcp4': False, 'address': ['198.51.100.3/26'], 'route': [{'network': '198.51.100.128', 'prefix': 26, 'gateway': '198.51.100.1', 'metric': 2, 'table': 'custom'}, {'network': '198.51.100.64', 'prefix': 26, 'gateway': '198.51.100.6', 'metric': 4, 'table': 'custom'}, {'network': '192.0.2.64', 'prefix': 26, 'gateway': '198.51.100.8', 'metric': 50, 'table': 'custom', 'src': '198.51.100.3'}]}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204660.2342596-42411-72949181721194/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41175 1727204660.69015: _low_level_execute_command(): starting 41175 1727204660.69024: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204660.2342596-42411-72949181721194/ > /dev/null 2>&1 && sleep 0' 41175 1727204660.69611: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204660.69615: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 41175 1727204660.69620: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204660.69657: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204660.69746: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204660.69757: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204660.69779: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204660.69820: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204660.71781: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204660.71832: stderr chunk (state=3): >>><<< 41175 1727204660.71835: stdout chunk (state=3): >>><<< 41175 1727204660.71850: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204660.71857: handler run complete 41175 1727204660.71908: attempt loop complete, returning result 41175 1727204660.71911: _execute() done 41175 1727204660.71914: dumping result to json 41175 1727204660.71928: done dumping result, returning 41175 1727204660.71937: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12b410aa-8751-f070-39c4-000000000077] 41175 1727204660.71942: sending task result for task 12b410aa-8751-f070-39c4-000000000077 41175 1727204660.72074: done sending task result for task 12b410aa-8751-f070-39c4-000000000077 41175 1727204660.72077: WORKER PROCESS EXITING changed: [managed-node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": true, "interface_name": "ethtest0", "ip": { "address": [ "198.51.100.3/26" ], "dhcp4": false, "route": [ { "gateway": "198.51.100.1", "metric": 2, "network": "198.51.100.128", "prefix": 26, "table": "custom" }, { "gateway": "198.51.100.6", "metric": 4, "network": "198.51.100.64", "prefix": 26, "table": "custom" }, { "gateway": "198.51.100.8", "metric": 50, "network": "192.0.2.64", "prefix": 26, "src": "198.51.100.3", "table": "custom" } ] }, "name": "ethtest0", "state": "up", "type": "ethernet" } ], 
"force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [003] #0, state:up persistent_state:present, 'ethtest0': update connection ethtest0, 270b38fe-b70b-4444-a0b4-74394ad4b2b5 [004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 270b38fe-b70b-4444-a0b4-74394ad4b2b5 (is-modified) [005] #0, state:up persistent_state:present, 'ethtest0': connection reapplied 41175 1727204660.72280: no more pending results, returning what we have 41175 1727204660.72284: results queue empty 41175 1727204660.72285: checking for any_errors_fatal 41175 1727204660.72301: done checking for any_errors_fatal 41175 1727204660.72302: checking for max_fail_percentage 41175 1727204660.72304: done checking for max_fail_percentage 41175 1727204660.72305: checking to see if all hosts have failed and the running result is not ok 41175 1727204660.72306: done checking to see if all hosts have failed 41175 1727204660.72307: getting the remaining hosts for this loop 41175 1727204660.72308: done getting the remaining hosts for this loop 41175 1727204660.72313: getting the next task for host managed-node3 41175 1727204660.72321: done getting next task for host managed-node3 41175 1727204660.72325: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 41175 1727204660.72327: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204660.72340: getting variables 41175 1727204660.72342: in VariableManager get_vars() 41175 1727204660.72382: Calling all_inventory to load vars for managed-node3 41175 1727204660.72385: Calling groups_inventory to load vars for managed-node3 41175 1727204660.72388: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204660.72410: Calling all_plugins_play to load vars for managed-node3 41175 1727204660.72415: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204660.72421: Calling groups_plugins_play to load vars for managed-node3 41175 1727204660.73812: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204660.75591: done with get_vars() 41175 1727204660.75626: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 15:04:20 -0400 (0:00:00.775) 0:00:27.895 ***** 41175 1727204660.75710: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_state 41175 1727204660.76005: worker is 1 (out of 1 available) 41175 1727204660.76022: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_state 41175 1727204660.76034: done queuing things up, now waiting for results queue to drain 41175 1727204660.76036: waiting for pending results... 
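The `module_args` captured in the task result above can be read back into role variables. The sketch below is a reconstruction from the logged invocation, not the original test playbook: the play wrapper (`hosts`, `roles`) is an assumption, while the connection spec itself is taken verbatim from the `module_args` dump.

```yaml
# Sketch reconstructed from the logged module_args; the play structure
# is assumed, the network_connections content comes from the log above.
- hosts: managed-node3
  roles:
    - fedora.linux_system_roles.network
  vars:
    network_connections:
      - name: ethtest0
        interface_name: ethtest0
        type: ethernet
        state: up
        autoconnect: true
        ip:
          dhcp4: false
          address:
            - 198.51.100.3/26
          route:
            - network: 198.51.100.128
              prefix: 26
              gateway: 198.51.100.1
              metric: 2
              table: custom
            - network: 198.51.100.64
              prefix: 26
              gateway: 198.51.100.6
              metric: 4
              table: custom
            - network: 192.0.2.64
              prefix: 26
              gateway: 198.51.100.8
              metric: 50
              table: custom
              src: 198.51.100.3
```

The `table: custom` entries place the three static routes into a named routing table rather than the main table, and the third route additionally pins a source address via `src`, matching the `(is-modified)` reapply reported in the module's stderr.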
41175 1727204660.76309: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state 41175 1727204660.76477: in run() - task 12b410aa-8751-f070-39c4-000000000078 41175 1727204660.76505: variable 'ansible_search_path' from source: unknown 41175 1727204660.76516: variable 'ansible_search_path' from source: unknown 41175 1727204660.76567: calling self._execute() 41175 1727204660.76652: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204660.76658: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204660.76670: variable 'omit' from source: magic vars 41175 1727204660.77050: variable 'ansible_distribution_major_version' from source: facts 41175 1727204660.77065: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204660.77214: variable 'network_state' from source: role '' defaults 41175 1727204660.77220: Evaluated conditional (network_state != {}): False 41175 1727204660.77223: when evaluation is False, skipping this task 41175 1727204660.77226: _execute() done 41175 1727204660.77229: dumping result to json 41175 1727204660.77233: done dumping result, returning 41175 1727204660.77236: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state [12b410aa-8751-f070-39c4-000000000078] 41175 1727204660.77269: sending task result for task 12b410aa-8751-f070-39c4-000000000078 41175 1727204660.77371: done sending task result for task 12b410aa-8751-f070-39c4-000000000078 41175 1727204660.77374: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 41175 1727204660.77452: no more pending results, returning what we have 41175 1727204660.77456: results queue empty 41175 1727204660.77457: checking for any_errors_fatal 41175 1727204660.77469: done checking for any_errors_fatal 
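The `skipping:` result above is the role's `network_state` path being bypassed: `network_state` defaults to `{}`, so the conditional `network_state != {}` evaluates False. A minimal sketch of a task guarded this way follows; the task name, action plugin, and condition appear in the log, but the exact task body in `roles/network/tasks/main.yml` is not shown here, so the argument shape is an assumption.

```yaml
# Hypothetical sketch of the guarded task; only the name, the
# network_state action, and the when: condition are taken from the log.
- name: Configure networking state
  fedora.linux_system_roles.network_state:
    state: "{{ network_state }}"
  when: network_state != {}
```

Because the play only sets `network_connections`, this branch is expected to skip on every host in this run.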
41175 1727204660.77470: checking for max_fail_percentage 41175 1727204660.77472: done checking for max_fail_percentage 41175 1727204660.77473: checking to see if all hosts have failed and the running result is not ok 41175 1727204660.77474: done checking to see if all hosts have failed 41175 1727204660.77474: getting the remaining hosts for this loop 41175 1727204660.77476: done getting the remaining hosts for this loop 41175 1727204660.77481: getting the next task for host managed-node3 41175 1727204660.77487: done getting next task for host managed-node3 41175 1727204660.77493: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 41175 1727204660.77496: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204660.77514: getting variables 41175 1727204660.77516: in VariableManager get_vars() 41175 1727204660.77555: Calling all_inventory to load vars for managed-node3 41175 1727204660.77558: Calling groups_inventory to load vars for managed-node3 41175 1727204660.77561: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204660.77571: Calling all_plugins_play to load vars for managed-node3 41175 1727204660.77574: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204660.77578: Calling groups_plugins_play to load vars for managed-node3 41175 1727204660.79198: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204660.84735: done with get_vars() 41175 1727204660.84762: done getting variables 41175 1727204660.84807: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 15:04:20 -0400 (0:00:00.091) 0:00:27.987 ***** 41175 1727204660.84830: entering _queue_task() for managed-node3/debug 41175 1727204660.85112: worker is 1 (out of 1 available) 41175 1727204660.85127: exiting _queue_task() for managed-node3/debug 41175 1727204660.85140: done queuing things up, now waiting for results queue to drain 41175 1727204660.85142: waiting for pending results... 
41175 1727204660.85353: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections
41175 1727204660.85464: in run() - task 12b410aa-8751-f070-39c4-000000000079
41175 1727204660.85482: variable 'ansible_search_path' from source: unknown
41175 1727204660.85488: variable 'ansible_search_path' from source: unknown
41175 1727204660.85518: calling self._execute()
41175 1727204660.85630: variable 'ansible_host' from source: host vars for 'managed-node3'
41175 1727204660.85639: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
41175 1727204660.85649: variable 'omit' from source: magic vars
41175 1727204660.86001: variable 'ansible_distribution_major_version' from source: facts
41175 1727204660.86012: Evaluated conditional (ansible_distribution_major_version != '6'): True
41175 1727204660.86021: variable 'omit' from source: magic vars
41175 1727204660.86073: variable 'omit' from source: magic vars
41175 1727204660.86104: variable 'omit' from source: magic vars
41175 1727204660.86144: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
41175 1727204660.86176: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
41175 1727204660.86195: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
41175 1727204660.86228: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
41175 1727204660.86239: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
41175 1727204660.86270: variable 'inventory_hostname' from source: host vars for 'managed-node3'
41175 1727204660.86275: variable 'ansible_host' from source: host vars for 'managed-node3'
41175 1727204660.86278: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
41175 1727204660.86366: Set connection var ansible_shell_executable to /bin/sh
41175 1727204660.86371: Set connection var ansible_shell_type to sh
41175 1727204660.86381: Set connection var ansible_pipelining to False
41175 1727204660.86394: Set connection var ansible_timeout to 10
41175 1727204660.86429: Set connection var ansible_connection to ssh
41175 1727204660.86433: Set connection var ansible_module_compression to ZIP_DEFLATED
41175 1727204660.86454: variable 'ansible_shell_executable' from source: unknown
41175 1727204660.86458: variable 'ansible_connection' from source: unknown
41175 1727204660.86461: variable 'ansible_module_compression' from source: unknown
41175 1727204660.86465: variable 'ansible_shell_type' from source: unknown
41175 1727204660.86468: variable 'ansible_shell_executable' from source: unknown
41175 1727204660.86471: variable 'ansible_host' from source: host vars for 'managed-node3'
41175 1727204660.86476: variable 'ansible_pipelining' from source: unknown
41175 1727204660.86486: variable 'ansible_timeout' from source: unknown
41175 1727204660.86489: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
41175 1727204660.86656: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
41175 1727204660.86666: variable 'omit' from source: magic vars
41175 1727204660.86672: starting attempt loop
41175 1727204660.86675: running the handler
41175 1727204660.86837: variable '__network_connections_result' from source: set_fact
41175 1727204660.86905: handler run complete
41175 1727204660.86932: attempt loop complete, returning result
41175 1727204660.86935: _execute() done
41175 1727204660.86940: dumping result to json
41175 1727204660.86943: done dumping result, returning
41175 1727204660.86970: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12b410aa-8751-f070-39c4-000000000079]
41175 1727204660.86973: sending task result for task 12b410aa-8751-f070-39c4-000000000079
41175 1727204660.87057: done sending task result for task 12b410aa-8751-f070-39c4-000000000079
41175 1727204660.87060: WORKER PROCESS EXITING
ok: [managed-node3] => {
    "__network_connections_result.stderr_lines": [
        "[003] #0, state:up persistent_state:present, 'ethtest0': update connection ethtest0, 270b38fe-b70b-4444-a0b4-74394ad4b2b5",
        "[004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 270b38fe-b70b-4444-a0b4-74394ad4b2b5 (is-modified)",
        "[005] #0, state:up persistent_state:present, 'ethtest0': connection reapplied"
    ]
}
41175 1727204660.87140: no more pending results, returning what we have
41175 1727204660.87144: results queue empty
41175 1727204660.87145: checking for any_errors_fatal
41175 1727204660.87157: done checking for any_errors_fatal
41175 1727204660.87158: checking for max_fail_percentage
41175 1727204660.87160: done checking for max_fail_percentage
41175 1727204660.87161: checking to see if all hosts have failed and the running result is not ok
41175 1727204660.87162: done checking to see if all hosts have failed
41175 1727204660.87163: getting the remaining hosts for this loop
41175 1727204660.87165: done getting the remaining hosts for this loop
41175 1727204660.87169: getting the next task for host managed-node3
41175 1727204660.87177: done getting next task for host managed-node3
41175 1727204660.87181: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections
41175 1727204660.87184: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
41175 1727204660.87198: getting variables
41175 1727204660.87200: in VariableManager get_vars()
41175 1727204660.87243: Calling all_inventory to load vars for managed-node3
41175 1727204660.87246: Calling groups_inventory to load vars for managed-node3
41175 1727204660.87248: Calling all_plugins_inventory to load vars for managed-node3
41175 1727204660.87259: Calling all_plugins_play to load vars for managed-node3
41175 1727204660.87263: Calling groups_plugins_inventory to load vars for managed-node3
41175 1727204660.87266: Calling groups_plugins_play to load vars for managed-node3
41175 1727204660.88722: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41175 1727204660.90848: done with get_vars()
41175 1727204660.90882: done getting variables
41175 1727204660.90931: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] ***
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181
Tuesday 24 September 2024 15:04:20 -0400 (0:00:00.061) 0:00:28.048 *****
41175 1727204660.90962: entering _queue_task() for managed-node3/debug
41175 1727204660.91275: worker is 1 (out of 1 available)
41175 1727204660.91293: exiting _queue_task() for managed-node3/debug
41175 1727204660.91306: done queuing things up, now waiting for results queue to drain
41175 1727204660.91308: waiting for pending results...
41175 1727204660.91645: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections
41175 1727204660.91722: in run() - task 12b410aa-8751-f070-39c4-00000000007a
41175 1727204660.91727: variable 'ansible_search_path' from source: unknown
41175 1727204660.91730: variable 'ansible_search_path' from source: unknown
41175 1727204660.91765: calling self._execute()
41175 1727204660.91879: variable 'ansible_host' from source: host vars for 'managed-node3'
41175 1727204660.91897: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
41175 1727204660.91907: variable 'omit' from source: magic vars
41175 1727204660.92237: variable 'ansible_distribution_major_version' from source: facts
41175 1727204660.92248: Evaluated conditional (ansible_distribution_major_version != '6'): True
41175 1727204660.92255: variable 'omit' from source: magic vars
41175 1727204660.92307: variable 'omit' from source: magic vars
41175 1727204660.92340: variable 'omit' from source: magic vars
41175 1727204660.92375: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
41175 1727204660.92409: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
41175 1727204660.92429: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
41175 1727204660.92445: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
41175 1727204660.92456: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
41175 1727204660.92485: variable 'inventory_hostname' from source: host vars for 'managed-node3'
41175 1727204660.92491: variable 'ansible_host' from source: host vars for 'managed-node3'
41175 1727204660.92494: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
41175 1727204660.92579: Set connection var ansible_shell_executable to /bin/sh
41175 1727204660.92584: Set connection var ansible_shell_type to sh
41175 1727204660.92590: Set connection var ansible_pipelining to False
41175 1727204660.92601: Set connection var ansible_timeout to 10
41175 1727204660.92613: Set connection var ansible_connection to ssh
41175 1727204660.92619: Set connection var ansible_module_compression to ZIP_DEFLATED
41175 1727204660.92638: variable 'ansible_shell_executable' from source: unknown
41175 1727204660.92641: variable 'ansible_connection' from source: unknown
41175 1727204660.92644: variable 'ansible_module_compression' from source: unknown
41175 1727204660.92647: variable 'ansible_shell_type' from source: unknown
41175 1727204660.92649: variable 'ansible_shell_executable' from source: unknown
41175 1727204660.92652: variable 'ansible_host' from source: host vars for 'managed-node3'
41175 1727204660.92658: variable 'ansible_pipelining' from source: unknown
41175 1727204660.92661: variable 'ansible_timeout' from source: unknown
41175 1727204660.92667: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
41175 1727204660.92786: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
41175 1727204660.92799: variable 'omit' from source: magic vars
41175 1727204660.92807: starting attempt loop
41175 1727204660.92810: running the handler
41175 1727204660.92857: variable '__network_connections_result' from source: set_fact
41175 1727204660.92924: variable '__network_connections_result' from source: set_fact
41175 1727204660.93092: handler run complete
41175 1727204660.93131: attempt loop complete, returning result
41175 1727204660.93134: _execute() done
41175 1727204660.93139: dumping result to json
41175 1727204660.93142: done dumping result, returning
41175 1727204660.93155: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12b410aa-8751-f070-39c4-00000000007a]
41175 1727204660.93158: sending task result for task 12b410aa-8751-f070-39c4-00000000007a
41175 1727204660.93265: done sending task result for task 12b410aa-8751-f070-39c4-00000000007a
41175 1727204660.93270: WORKER PROCESS EXITING
ok: [managed-node3] => {
    "__network_connections_result": {
        "_invocation": {
            "module_args": {
                "__debug_flags": "",
                "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
                "connections": [
                    {
                        "autoconnect": true,
                        "interface_name": "ethtest0",
                        "ip": {
                            "address": [
                                "198.51.100.3/26"
                            ],
                            "dhcp4": false,
                            "route": [
                                {
                                    "gateway": "198.51.100.1",
                                    "metric": 2,
                                    "network": "198.51.100.128",
                                    "prefix": 26,
                                    "table": "custom"
                                },
                                {
                                    "gateway": "198.51.100.6",
                                    "metric": 4,
                                    "network": "198.51.100.64",
                                    "prefix": 26,
                                    "table": "custom"
                                },
                                {
                                    "gateway": "198.51.100.8",
                                    "metric": 50,
                                    "network": "192.0.2.64",
                                    "prefix": 26,
                                    "src": "198.51.100.3",
                                    "table": "custom"
                                }
                            ]
                        },
                        "name": "ethtest0",
                        "state": "up",
                        "type": "ethernet"
                    }
                ],
                "force_state_change": false,
                "ignore_errors": false,
                "provider": "nm"
            }
        },
        "changed": true,
        "failed": false,
        "stderr": "[003] #0, state:up persistent_state:present, 'ethtest0': update connection ethtest0, 270b38fe-b70b-4444-a0b4-74394ad4b2b5\n[004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 270b38fe-b70b-4444-a0b4-74394ad4b2b5 (is-modified)\n[005] #0, state:up persistent_state:present, 'ethtest0': connection reapplied\n",
        "stderr_lines": [
            "[003] #0, state:up persistent_state:present, 'ethtest0': update connection ethtest0, 270b38fe-b70b-4444-a0b4-74394ad4b2b5",
            "[004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 270b38fe-b70b-4444-a0b4-74394ad4b2b5 (is-modified)",
            "[005] #0, state:up persistent_state:present, 'ethtest0': connection reapplied"
        ]
    }
}
41175 1727204660.93413: no more pending results, returning what we have
41175 1727204660.93419: results queue empty
41175 1727204660.93420: checking for any_errors_fatal
41175 1727204660.93425: done checking for any_errors_fatal
41175 1727204660.93426: checking for max_fail_percentage
41175 1727204660.93428: done checking for max_fail_percentage
41175 1727204660.93433: checking to see if all hosts have failed and the running result is not ok
41175 1727204660.93434: done checking to see if all hosts have failed
41175 1727204660.93435: getting the remaining hosts for this loop
41175 1727204660.93436: done getting the remaining hosts for this loop
41175 1727204660.93440: getting the next task for host managed-node3
41175 1727204660.93445: done getting next task for host managed-node3
41175 1727204660.93449: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state
41175 1727204660.93451: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
41175 1727204660.93463: getting variables
41175 1727204660.93465: in VariableManager get_vars()
41175 1727204660.93498: Calling all_inventory to load vars for managed-node3
41175 1727204660.93507: Calling groups_inventory to load vars for managed-node3
41175 1727204660.93509: Calling all_plugins_inventory to load vars for managed-node3
41175 1727204660.93518: Calling all_plugins_play to load vars for managed-node3
41175 1727204660.93520: Calling groups_plugins_inventory to load vars for managed-node3
41175 1727204660.93523: Calling groups_plugins_play to load vars for managed-node3
41175 1727204660.95540: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41175 1727204660.98105: done with get_vars()
41175 1727204660.98145: done getting variables
41175 1727204660.98217: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] ***
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186
Tuesday 24 September 2024 15:04:20 -0400 (0:00:00.072) 0:00:28.121 *****
41175 1727204660.98249: entering _queue_task() for managed-node3/debug
41175 1727204660.98604: worker is 1 (out of 1 available)
41175 1727204660.98620: exiting _queue_task() for managed-node3/debug
41175 1727204660.98634: done queuing things up, now waiting for results queue to drain
41175 1727204660.98636: waiting for pending results...
41175 1727204660.98895: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state
41175 1727204660.99032: in run() - task 12b410aa-8751-f070-39c4-00000000007b
41175 1727204660.99074: variable 'ansible_search_path' from source: unknown
41175 1727204660.99084: variable 'ansible_search_path' from source: unknown
41175 1727204660.99135: calling self._execute()
41175 1727204660.99226: variable 'ansible_host' from source: host vars for 'managed-node3'
41175 1727204660.99231: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
41175 1727204660.99234: variable 'omit' from source: magic vars
41175 1727204660.99703: variable 'ansible_distribution_major_version' from source: facts
41175 1727204660.99708: Evaluated conditional (ansible_distribution_major_version != '6'): True
41175 1727204660.99865: variable 'network_state' from source: role '' defaults
41175 1727204660.99870: Evaluated conditional (network_state != {}): False
41175 1727204660.99873: when evaluation is False, skipping this task
41175 1727204660.99876: _execute() done
41175 1727204660.99879: dumping result to json
41175 1727204660.99881: done dumping result, returning
41175 1727204660.99941: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12b410aa-8751-f070-39c4-00000000007b]
41175 1727204660.99945: sending task result for task 12b410aa-8751-f070-39c4-00000000007b
41175 1727204661.00035: done sending task result for task 12b410aa-8751-f070-39c4-00000000007b
41175 1727204661.00038: WORKER PROCESS EXITING
skipping: [managed-node3] => {
    "false_condition": "network_state != {}"
}
41175 1727204661.00148: no more pending results, returning what we have
41175 1727204661.00154: results queue empty
41175 1727204661.00155: checking for any_errors_fatal
41175 1727204661.00164: done checking for any_errors_fatal
41175 1727204661.00165: checking for max_fail_percentage
41175 1727204661.00167: done checking for max_fail_percentage
41175 1727204661.00168: checking to see if all hosts have failed and the running result is not ok
41175 1727204661.00169: done checking to see if all hosts have failed
41175 1727204661.00170: getting the remaining hosts for this loop
41175 1727204661.00171: done getting the remaining hosts for this loop
41175 1727204661.00175: getting the next task for host managed-node3
41175 1727204661.00187: done getting next task for host managed-node3
41175 1727204661.00195: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity
41175 1727204661.00200: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
41175 1727204661.00221: getting variables
41175 1727204661.00223: in VariableManager get_vars()
41175 1727204661.00281: Calling all_inventory to load vars for managed-node3
41175 1727204661.00285: Calling groups_inventory to load vars for managed-node3
41175 1727204661.00288: Calling all_plugins_inventory to load vars for managed-node3
41175 1727204661.00303: Calling all_plugins_play to load vars for managed-node3
41175 1727204661.00307: Calling groups_plugins_inventory to load vars for managed-node3
41175 1727204661.00310: Calling groups_plugins_play to load vars for managed-node3
41175 1727204661.02649: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41175 1727204661.05183: done with get_vars()
41175 1727204661.05209: done getting variables

TASK [fedora.linux_system_roles.network : Re-test connectivity] ****************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Tuesday 24 September 2024 15:04:21 -0400 (0:00:00.070) 0:00:28.191 *****
41175 1727204661.05298: entering _queue_task() for managed-node3/ping
41175 1727204661.05570: worker is 1 (out of 1 available)
41175 1727204661.05586: exiting _queue_task() for managed-node3/ping
41175 1727204661.05600: done queuing things up, now waiting for results queue to drain
41175 1727204661.05602: waiting for pending results...
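The "Re-test connectivity" task queued above is dispatched as managed-node3/ping, i.e. Ansible's `ping` module (an end-to-end check that a module can be shipped to and executed on the host, not an ICMP ping). A minimal sketch of what the task at roles/network/tasks/main.yml:192 might look like, assuming only the name from the banner and the module from the log, is:

```yaml
# Hypothetical sketch -- the real task lives in the role sources.
- name: Re-test connectivity
  ping:
```

This is why the following log entries show the full module pipeline: a temp dir is created on the remote, AnsiballZ_ping.py is transferred over SFTP, chmod'ed, and executed.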
41175 1727204661.05815: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity
41175 1727204661.05936: in run() - task 12b410aa-8751-f070-39c4-00000000007c
41175 1727204661.05954: variable 'ansible_search_path' from source: unknown
41175 1727204661.05958: variable 'ansible_search_path' from source: unknown
41175 1727204661.05990: calling self._execute()
41175 1727204661.06077: variable 'ansible_host' from source: host vars for 'managed-node3'
41175 1727204661.06085: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
41175 1727204661.06099: variable 'omit' from source: magic vars
41175 1727204661.06433: variable 'ansible_distribution_major_version' from source: facts
41175 1727204661.06444: Evaluated conditional (ansible_distribution_major_version != '6'): True
41175 1727204661.06451: variable 'omit' from source: magic vars
41175 1727204661.06506: variable 'omit' from source: magic vars
41175 1727204661.06536: variable 'omit' from source: magic vars
41175 1727204661.06571: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
41175 1727204661.06604: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
41175 1727204661.06630: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
41175 1727204661.06646: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
41175 1727204661.06657: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
41175 1727204661.06685: variable 'inventory_hostname' from source: host vars for 'managed-node3'
41175 1727204661.06690: variable 'ansible_host' from source: host vars for 'managed-node3'
41175 1727204661.06693: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
41175 1727204661.06785: Set connection var ansible_shell_executable to /bin/sh
41175 1727204661.06788: Set connection var ansible_shell_type to sh
41175 1727204661.06796: Set connection var ansible_pipelining to False
41175 1727204661.06805: Set connection var ansible_timeout to 10
41175 1727204661.06815: Set connection var ansible_connection to ssh
41175 1727204661.06817: Set connection var ansible_module_compression to ZIP_DEFLATED
41175 1727204661.06846: variable 'ansible_shell_executable' from source: unknown
41175 1727204661.06849: variable 'ansible_connection' from source: unknown
41175 1727204661.06852: variable 'ansible_module_compression' from source: unknown
41175 1727204661.06856: variable 'ansible_shell_type' from source: unknown
41175 1727204661.06873: variable 'ansible_shell_executable' from source: unknown
41175 1727204661.06876: variable 'ansible_host' from source: host vars for 'managed-node3'
41175 1727204661.06878: variable 'ansible_pipelining' from source: unknown
41175 1727204661.06892: variable 'ansible_timeout' from source: unknown
41175 1727204661.06895: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
41175 1727204661.07153: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action)
41175 1727204661.07157: variable 'omit' from source: magic vars
41175 1727204661.07160: starting attempt loop
41175 1727204661.07162: running the handler
41175 1727204661.07165: _low_level_execute_command(): starting
41175 1727204661.07167: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
41175 1727204661.07895: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
41175 1727204661.07901: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<<
41175 1727204661.07905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
41175 1727204661.07978: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
41175 1727204661.08016: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
41175 1727204661.08034: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
41175 1727204661.08064: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
41175 1727204661.09819: stdout chunk (state=3): >>>/root <<<
41175 1727204661.09930: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
41175 1727204661.09977: stderr chunk (state=3): >>><<<
41175 1727204661.09981: stdout chunk (state=3): >>><<<
41175 1727204661.10001: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
41175 1727204661.10013: _low_level_execute_command(): starting
41175 1727204661.10024: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204661.1000104-42438-234233049502168 `" && echo ansible-tmp-1727204661.1000104-42438-234233049502168="` echo /root/.ansible/tmp/ansible-tmp-1727204661.1000104-42438-234233049502168 `" ) && sleep 0'
41175 1727204661.10456: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
41175 1727204661.10460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<<
41175 1727204661.10470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
41175 1727204661.10522: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<<
41175 1727204661.10526: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
41175 1727204661.10564: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
41175 1727204661.12529: stdout chunk (state=3): >>>ansible-tmp-1727204661.1000104-42438-234233049502168=/root/.ansible/tmp/ansible-tmp-1727204661.1000104-42438-234233049502168 <<<
41175 1727204661.12651: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
41175 1727204661.12705: stderr chunk (state=3): >>><<<
41175 1727204661.12711: stdout chunk (state=3): >>><<<
41175 1727204661.12727: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204661.1000104-42438-234233049502168=/root/.ansible/tmp/ansible-tmp-1727204661.1000104-42438-234233049502168 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
41175 1727204661.12773: variable 'ansible_module_compression' from source: unknown
41175 1727204661.12808: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-41175qa2aqh90/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED
41175 1727204661.12846: variable 'ansible_facts' from source: unknown
41175 1727204661.12910: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204661.1000104-42438-234233049502168/AnsiballZ_ping.py
41175 1727204661.13025: Sending initial data
41175 1727204661.13029: Sent initial data (153 bytes)
41175 1727204661.13459: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
41175 1727204661.13495: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
41175 1727204661.13498: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<<
41175 1727204661.13501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<<
41175 1727204661.13503: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
41175 1727204661.13506: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
41175 1727204661.13564: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
41175 1727204661.13568: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
41175 1727204661.13609: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
41175 1727204661.15219: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<<
41175 1727204661.15295: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<<
41175 1727204661.15325: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41175qa2aqh90/tmpg5k1l_0t /root/.ansible/tmp/ansible-tmp-1727204661.1000104-42438-234233049502168/AnsiballZ_ping.py <<<
41175 1727204661.15329: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204661.1000104-42438-234233049502168/AnsiballZ_ping.py" <<<
41175 1727204661.15396: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-41175qa2aqh90/tmpg5k1l_0t" to remote "/root/.ansible/tmp/ansible-tmp-1727204661.1000104-42438-234233049502168/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204661.1000104-42438-234233049502168/AnsiballZ_ping.py" <<<
41175 1727204661.16540: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
41175 1727204661.16544: stdout chunk (state=3): >>><<<
41175 1727204661.16546: stderr chunk (state=3): >>><<<
41175 1727204661.16549: done transferring module to remote
41175 1727204661.16551: _low_level_execute_command(): starting
41175 1727204661.16553: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204661.1000104-42438-234233049502168/ /root/.ansible/tmp/ansible-tmp-1727204661.1000104-42438-234233049502168/AnsiballZ_ping.py && sleep 0'
41175 1727204661.17044: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
41175 1727204661.17048: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
41175 1727204661.17063: stderr chunk (state=3): >>>debug1: configuration
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204661.17066: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204661.17113: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204661.17122: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204661.17157: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204661.18985: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204661.19040: stderr chunk (state=3): >>><<< 41175 1727204661.19043: stdout chunk (state=3): >>><<< 41175 1727204661.19060: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204661.19063: _low_level_execute_command(): starting 41175 1727204661.19070: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204661.1000104-42438-234233049502168/AnsiballZ_ping.py && sleep 0' 41175 1727204661.19629: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204661.19633: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 41175 1727204661.19637: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204661.19662: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204661.19709: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204661.19782: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 
1727204661.36878: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 41175 1727204661.38346: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 41175 1727204661.38411: stderr chunk (state=3): >>><<< 41175 1727204661.38415: stdout chunk (state=3): >>><<< 41175 1727204661.38437: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
41175 1727204661.38460: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204661.1000104-42438-234233049502168/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None})
41175 1727204661.38471: _low_level_execute_command(): starting
41175 1727204661.38477: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204661.1000104-42438-234233049502168/ > /dev/null 2>&1 && sleep 0'
41175 1727204661.38951: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<<
41175 1727204661.38988: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<<
41175 1727204661.38993: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
41175 1727204661.38996: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
41175 1727204661.38998: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
41175 1727204661.39049: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
41175 1727204661.39053: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
41175 1727204661.39103: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
41175 1727204661.41013: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
41175 1727204661.41064: stderr chunk (state=3): >>><<<
41175 1727204661.41067: stdout chunk (state=3): >>><<<
41175 1727204661.41081: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
41175 1727204661.41092: handler run complete
41175 1727204661.41109: attempt loop complete, returning result
41175 1727204661.41112: _execute() done
41175 1727204661.41115: dumping result to json
41175 1727204661.41121: done dumping result, returning
41175 1727204661.41132: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [12b410aa-8751-f070-39c4-00000000007c]
41175 1727204661.41141: sending task result for task 12b410aa-8751-f070-39c4-00000000007c
41175 1727204661.41242: done sending task result for task 12b410aa-8751-f070-39c4-00000000007c
41175 1727204661.41246: WORKER PROCESS EXITING
ok: [managed-node3] => { "changed": false, "ping": "pong" }
41175 1727204661.41320: no more pending results, returning what we have
41175 1727204661.41324: results queue empty
41175 1727204661.41325: checking for any_errors_fatal
41175 1727204661.41333: done checking for any_errors_fatal
41175 1727204661.41334: checking for max_fail_percentage
41175 1727204661.41335: done checking for max_fail_percentage
41175 1727204661.41336: checking to see if all hosts have failed and the running result is not ok
41175 1727204661.41338: done checking to see if all hosts have failed
41175 1727204661.41339: getting the remaining hosts for this loop
41175 1727204661.41340: done getting the remaining hosts for this loop
41175 1727204661.41345: getting the next task for host managed-node3
41175 1727204661.41357: done getting next task for host managed-node3
41175 1727204661.41359: ^ task is: TASK: meta (role_complete)
41175 1727204661.41362: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
41175 1727204661.41376: getting variables
41175 1727204661.41378: in VariableManager get_vars()
41175 1727204661.41426: Calling all_inventory to load vars for managed-node3
41175 1727204661.41430: Calling groups_inventory to load vars for managed-node3
41175 1727204661.41432: Calling all_plugins_inventory to load vars for managed-node3
41175 1727204661.41444: Calling all_plugins_play to load vars for managed-node3
41175 1727204661.41447: Calling groups_plugins_inventory to load vars for managed-node3
41175 1727204661.41450: Calling groups_plugins_play to load vars for managed-node3
41175 1727204661.42719: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41175 1727204661.44332: done with get_vars()
41175 1727204661.44356: done getting variables
41175 1727204661.44425: done queuing things up, now waiting for results queue to drain
41175 1727204661.44427: results queue empty
41175 1727204661.44428: checking for any_errors_fatal
41175 1727204661.44432: done checking for any_errors_fatal
41175 1727204661.44433: checking for max_fail_percentage
41175 1727204661.44434: done checking for max_fail_percentage
41175 1727204661.44435: checking to see if all hosts have failed and the running result is not ok
41175 1727204661.44436: done checking to see if all hosts have failed
41175 1727204661.44436: getting the remaining hosts for this loop
41175 1727204661.44437: done getting the remaining hosts for this loop
41175 1727204661.44439: getting the next task for host managed-node3
41175 1727204661.44443: done getting next task for host managed-node3
41175 1727204661.44445: ^ task is: TASK: Get the routes from the named route table 'custom'
41175 1727204661.44446: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
41175 1727204661.44448: getting variables
41175 1727204661.44449: in VariableManager get_vars()
41175 1727204661.44461: Calling all_inventory to load vars for managed-node3
41175 1727204661.44462: Calling groups_inventory to load vars for managed-node3
41175 1727204661.44464: Calling all_plugins_inventory to load vars for managed-node3
41175 1727204661.44468: Calling all_plugins_play to load vars for managed-node3
41175 1727204661.44470: Calling groups_plugins_inventory to load vars for managed-node3
41175 1727204661.44473: Calling groups_plugins_play to load vars for managed-node3
41175 1727204661.45655: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41175 1727204661.47242: done with get_vars()
41175 1727204661.47266: done getting variables
41175 1727204661.47303: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Get the routes from the named route table 'custom'] **********************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:121
Tuesday 24 September 2024  15:04:21 -0400 (0:00:00.420)       0:00:28.612 *****
41175 1727204661.47326: entering _queue_task() for managed-node3/command
41175 1727204661.47644: worker is 1 (out of 1 available)
41175 1727204661.47658: exiting _queue_task() for managed-node3/command
41175 1727204661.47671: done queuing things up, now waiting for results queue to drain
41175 1727204661.47673: waiting for pending results...
41175 1727204661.47879: running TaskExecutor() for managed-node3/TASK: Get the routes from the named route table 'custom'
41175 1727204661.47959: in run() - task 12b410aa-8751-f070-39c4-0000000000ac
41175 1727204661.47973: variable 'ansible_search_path' from source: unknown
41175 1727204661.48008: calling self._execute()
41175 1727204661.48097: variable 'ansible_host' from source: host vars for 'managed-node3'
41175 1727204661.48104: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
41175 1727204661.48118: variable 'omit' from source: magic vars
41175 1727204661.48463: variable 'ansible_distribution_major_version' from source: facts
41175 1727204661.48475: Evaluated conditional (ansible_distribution_major_version != '6'): True
41175 1727204661.48482: variable 'omit' from source: magic vars
41175 1727204661.48504: variable 'omit' from source: magic vars
41175 1727204661.48536: variable 'omit' from source: magic vars
41175 1727204661.48574: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
41175 1727204661.48607: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
41175 1727204661.48627: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
41175 1727204661.48643: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
41175 1727204661.48657: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
41175 1727204661.48686: variable 'inventory_hostname' from source: host vars for 'managed-node3'
41175 1727204661.48690: variable 'ansible_host' from source: host vars for 'managed-node3'
41175 1727204661.48692: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
41175 1727204661.48782: Set connection var ansible_shell_executable to /bin/sh
41175 1727204661.48785: Set connection var ansible_shell_type to sh
41175 1727204661.48792: Set connection var ansible_pipelining to False
41175 1727204661.48802: Set connection var ansible_timeout to 10
41175 1727204661.48810: Set connection var ansible_connection to ssh
41175 1727204661.48815: Set connection var ansible_module_compression to ZIP_DEFLATED
41175 1727204661.48839: variable 'ansible_shell_executable' from source: unknown
41175 1727204661.48842: variable 'ansible_connection' from source: unknown
41175 1727204661.48844: variable 'ansible_module_compression' from source: unknown
41175 1727204661.48847: variable 'ansible_shell_type' from source: unknown
41175 1727204661.48851: variable 'ansible_shell_executable' from source: unknown
41175 1727204661.48855: variable 'ansible_host' from source: host vars for 'managed-node3'
41175 1727204661.48860: variable 'ansible_pipelining' from source: unknown
41175 1727204661.48864: variable 'ansible_timeout' from source: unknown
41175 1727204661.48870: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
41175 1727204661.48992: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
41175 1727204661.49004: variable 'omit' from source: magic vars
41175 1727204661.49010: starting attempt loop
41175 1727204661.49013: running the handler
41175 1727204661.49034: _low_level_execute_command(): starting
41175 1727204661.49040: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
41175 1727204661.49592: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
41175 1727204661.49626: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
41175 1727204661.49630: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
41175 1727204661.49633: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
41175 1727204661.49698: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
41175 1727204661.49703: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
41175 1727204661.49706: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
41175 1727204661.49739: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
41175 1727204661.51497: stdout chunk (state=3): >>>/root <<<
41175 1727204661.51606: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
41175 1727204661.51661: stderr chunk (state=3): >>><<<
41175 1727204661.51664: stdout chunk (state=3): >>><<<
41175 1727204661.51689: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
41175 1727204661.51703: _low_level_execute_command(): starting
41175 1727204661.51709: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204661.5168679-42458-54655147809281 `" && echo ansible-tmp-1727204661.5168679-42458-54655147809281="` echo /root/.ansible/tmp/ansible-tmp-1727204661.5168679-42458-54655147809281 `" ) && sleep 0'
41175 1727204661.52387: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
41175 1727204661.52413: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
41175 1727204661.52420: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
41175 1727204661.52424: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
41175 1727204661.52441: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
41175 1727204661.52513: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
41175 1727204661.54497: stdout chunk (state=3): >>>ansible-tmp-1727204661.5168679-42458-54655147809281=/root/.ansible/tmp/ansible-tmp-1727204661.5168679-42458-54655147809281 <<<
41175 1727204661.54688: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
41175 1727204661.54696: stdout chunk (state=3): >>><<<
41175 1727204661.54705: stderr chunk (state=3): >>><<<
41175 1727204661.54729: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204661.5168679-42458-54655147809281=/root/.ansible/tmp/ansible-tmp-1727204661.5168679-42458-54655147809281 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
41175 1727204661.54767: variable 'ansible_module_compression' from source: unknown
41175 1727204661.54827: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-41175qa2aqh90/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED
41175 1727204661.54899: variable 'ansible_facts' from source: unknown
41175 1727204661.54962: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204661.5168679-42458-54655147809281/AnsiballZ_command.py
41175 1727204661.55205: Sending initial data
41175 1727204661.55209: Sent initial data (155 bytes)
41175 1727204661.55620: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<<
41175 1727204661.55634: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
41175 1727204661.55649: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
41175 1727204661.55704: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<<
41175 1727204661.55737: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
41175 1727204661.55756: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
41175 1727204661.57378: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<<
41175 1727204661.57394: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<<
41175 1727204661.57416: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<<
41175 1727204661.57448: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41175qa2aqh90/tmp0qcochfo /root/.ansible/tmp/ansible-tmp-1727204661.5168679-42458-54655147809281/AnsiballZ_command.py <<<
41175 1727204661.57457: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204661.5168679-42458-54655147809281/AnsiballZ_command.py" <<<
41175 1727204661.57483: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-41175qa2aqh90/tmp0qcochfo" to remote "/root/.ansible/tmp/ansible-tmp-1727204661.5168679-42458-54655147809281/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204661.5168679-42458-54655147809281/AnsiballZ_command.py" <<<
41175 1727204661.58437: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
41175 1727204661.58494: stderr chunk (state=3): >>><<<
41175 1727204661.58498: stdout chunk (state=3): >>><<<
41175 1727204661.58516: done transferring module to remote
41175 1727204661.58536: _low_level_execute_command(): starting
41175 1727204661.58626: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204661.5168679-42458-54655147809281/ /root/.ansible/tmp/ansible-tmp-1727204661.5168679-42458-54655147809281/AnsiballZ_command.py && sleep 0'
41175 1727204661.59215: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<<
41175 1727204661.59227: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
41175 1727204661.59252: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
41175 1727204661.59256: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
41175 1727204661.59281: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
41175 1727204661.59293: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
41175 1727204661.59343: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
41175 1727204661.80663: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
41175 1727204661.80725: stderr chunk (state=3): >>><<<
41175 1727204661.80729: stdout chunk (state=3): >>><<<
41175 1727204661.80745: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204661.80750: _low_level_execute_command(): starting 41175 1727204661.80757: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204661.5168679-42458-54655147809281/AnsiballZ_command.py && sleep 0' 41175 1727204661.81255: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204661.81259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204661.81262: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204661.81264: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204661.81266: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204661.81328: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204661.81332: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 <<< 41175 1727204661.81380: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204661.99177: stdout chunk (state=3): >>> {"changed": true, "stdout": "192.0.2.64/26 via 198.51.100.8 dev ethtest0 proto static src 198.51.100.3 metric 50 \n198.51.100.64/26 via 198.51.100.6 dev ethtest0 proto static metric 4 \n198.51.100.128/26 via 198.51.100.1 dev ethtest0 proto static metric 2 ", "stderr": "", "rc": 0, "cmd": ["ip", "route", "show", "table", "custom"], "start": "2024-09-24 15:04:21.986790", "end": "2024-09-24 15:04:21.990495", "delta": "0:00:00.003705", "msg": "", "invocation": {"module_args": {"_raw_params": "ip route show table custom", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 41175 1727204662.00978: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
<<< 41175 1727204662.00982: stdout chunk (state=3): >>><<< 41175 1727204662.00985: stderr chunk (state=3): >>><<< 41175 1727204662.00996: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "192.0.2.64/26 via 198.51.100.8 dev ethtest0 proto static src 198.51.100.3 metric 50 \n198.51.100.64/26 via 198.51.100.6 dev ethtest0 proto static metric 4 \n198.51.100.128/26 via 198.51.100.1 dev ethtest0 proto static metric 2 ", "stderr": "", "rc": 0, "cmd": ["ip", "route", "show", "table", "custom"], "start": "2024-09-24 15:04:21.986790", "end": "2024-09-24 15:04:21.990495", "delta": "0:00:00.003705", "msg": "", "invocation": {"module_args": {"_raw_params": "ip route show table custom", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 41175 1727204662.01001: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip route show table custom', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204661.5168679-42458-54655147809281/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41175 1727204662.01010: _low_level_execute_command(): starting 41175 1727204662.01119: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204661.5168679-42458-54655147809281/ > /dev/null 2>&1 && sleep 0' 41175 1727204662.01761: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 
originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 41175 1727204662.01781: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204662.01799: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204662.01864: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204662.03748: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204662.03802: stderr chunk (state=3): >>><<< 41175 1727204662.03806: stdout chunk (state=3): >>><<< 41175 1727204662.03824: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204662.03832: handler run complete 41175 1727204662.03855: Evaluated conditional (False): 
False 41175 1727204662.03866: attempt loop complete, returning result 41175 1727204662.03869: _execute() done 41175 1727204662.03874: dumping result to json 41175 1727204662.03882: done dumping result, returning 41175 1727204662.03894: done running TaskExecutor() for managed-node3/TASK: Get the routes from the named route table 'custom' [12b410aa-8751-f070-39c4-0000000000ac] 41175 1727204662.03902: sending task result for task 12b410aa-8751-f070-39c4-0000000000ac 41175 1727204662.04019: done sending task result for task 12b410aa-8751-f070-39c4-0000000000ac 41175 1727204662.04023: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": [ "ip", "route", "show", "table", "custom" ], "delta": "0:00:00.003705", "end": "2024-09-24 15:04:21.990495", "rc": 0, "start": "2024-09-24 15:04:21.986790" } STDOUT: 192.0.2.64/26 via 198.51.100.8 dev ethtest0 proto static src 198.51.100.3 metric 50 198.51.100.64/26 via 198.51.100.6 dev ethtest0 proto static metric 4 198.51.100.128/26 via 198.51.100.1 dev ethtest0 proto static metric 2 41175 1727204662.04134: no more pending results, returning what we have 41175 1727204662.04139: results queue empty 41175 1727204662.04140: checking for any_errors_fatal 41175 1727204662.04142: done checking for any_errors_fatal 41175 1727204662.04143: checking for max_fail_percentage 41175 1727204662.04145: done checking for max_fail_percentage 41175 1727204662.04146: checking to see if all hosts have failed and the running result is not ok 41175 1727204662.04147: done checking to see if all hosts have failed 41175 1727204662.04148: getting the remaining hosts for this loop 41175 1727204662.04149: done getting the remaining hosts for this loop 41175 1727204662.04154: getting the next task for host managed-node3 41175 1727204662.04161: done getting next task for host managed-node3 41175 1727204662.04164: ^ task is: TASK: Assert that the named route table 'custom' contains the specified route 41175 1727204662.04166: ^ state is: HOST 
STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41175 1727204662.04169: getting variables 41175 1727204662.04171: in VariableManager get_vars() 41175 1727204662.04363: Calling all_inventory to load vars for managed-node3 41175 1727204662.04367: Calling groups_inventory to load vars for managed-node3 41175 1727204662.04370: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204662.04382: Calling all_plugins_play to load vars for managed-node3 41175 1727204662.04386: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204662.04392: Calling groups_plugins_play to load vars for managed-node3 41175 1727204662.05777: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204662.07452: done with get_vars() 41175 1727204662.07474: done getting variables 41175 1727204662.07527: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert that the named route table 'custom' contains the specified route] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:127 Tuesday 24 September 2024 15:04:22 -0400 (0:00:00.602) 0:00:29.214 ***** 41175 1727204662.07553: entering _queue_task() for managed-node3/assert 41175 1727204662.07797: worker is 1 (out of 1 available) 41175 1727204662.07811: exiting _queue_task() for managed-node3/assert 41175 1727204662.07824: done queuing things up, now waiting for results queue to drain 41175 
1727204662.07826: waiting for pending results... 41175 1727204662.08029: running TaskExecutor() for managed-node3/TASK: Assert that the named route table 'custom' contains the specified route 41175 1727204662.08105: in run() - task 12b410aa-8751-f070-39c4-0000000000ad 41175 1727204662.08118: variable 'ansible_search_path' from source: unknown 41175 1727204662.08156: calling self._execute() 41175 1727204662.08246: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204662.08253: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204662.08263: variable 'omit' from source: magic vars 41175 1727204662.08595: variable 'ansible_distribution_major_version' from source: facts 41175 1727204662.08610: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204662.08614: variable 'omit' from source: magic vars 41175 1727204662.08636: variable 'omit' from source: magic vars 41175 1727204662.08669: variable 'omit' from source: magic vars 41175 1727204662.08708: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41175 1727204662.08743: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41175 1727204662.08760: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41175 1727204662.08777: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204662.08788: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204662.08820: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41175 1727204662.08824: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204662.08836: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 
41175 1727204662.08915: Set connection var ansible_shell_executable to /bin/sh 41175 1727204662.08919: Set connection var ansible_shell_type to sh 41175 1727204662.08927: Set connection var ansible_pipelining to False 41175 1727204662.08941: Set connection var ansible_timeout to 10 41175 1727204662.08945: Set connection var ansible_connection to ssh 41175 1727204662.08951: Set connection var ansible_module_compression to ZIP_DEFLATED 41175 1727204662.08970: variable 'ansible_shell_executable' from source: unknown 41175 1727204662.08973: variable 'ansible_connection' from source: unknown 41175 1727204662.08977: variable 'ansible_module_compression' from source: unknown 41175 1727204662.08980: variable 'ansible_shell_type' from source: unknown 41175 1727204662.08984: variable 'ansible_shell_executable' from source: unknown 41175 1727204662.08988: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204662.08995: variable 'ansible_pipelining' from source: unknown 41175 1727204662.08999: variable 'ansible_timeout' from source: unknown 41175 1727204662.09004: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204662.09133: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41175 1727204662.09143: variable 'omit' from source: magic vars 41175 1727204662.09152: starting attempt loop 41175 1727204662.09155: running the handler 41175 1727204662.09314: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41175 1727204662.09524: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41175 1727204662.09561: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41175 1727204662.09629: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41175 1727204662.09660: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41175 1727204662.09737: variable 'route_table_custom' from source: set_fact 41175 1727204662.09761: Evaluated conditional (route_table_custom.stdout is search("198.51.100.128/26 via 198.51.100.1 dev ethtest0 proto static metric 2")): True 41175 1727204662.09878: variable 'route_table_custom' from source: set_fact 41175 1727204662.09903: Evaluated conditional (route_table_custom.stdout is search("198.51.100.64/26 via 198.51.100.6 dev ethtest0 proto static metric 4")): True 41175 1727204662.10014: variable 'route_table_custom' from source: set_fact 41175 1727204662.10046: Evaluated conditional (route_table_custom.stdout is search("192.0.2.64/26 via 198.51.100.8 dev ethtest0 proto static src 198.51.100.3 metric 50")): True 41175 1727204662.10049: handler run complete 41175 1727204662.10063: attempt loop complete, returning result 41175 1727204662.10066: _execute() done 41175 1727204662.10069: dumping result to json 41175 1727204662.10074: done dumping result, returning 41175 1727204662.10081: done running TaskExecutor() for managed-node3/TASK: Assert that the named route table 'custom' contains the specified route [12b410aa-8751-f070-39c4-0000000000ad] 41175 1727204662.10086: sending task result for task 12b410aa-8751-f070-39c4-0000000000ad 41175 1727204662.10182: done sending task result for task 12b410aa-8751-f070-39c4-0000000000ad 41175 1727204662.10185: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false } MSG: All assertions passed 41175 1727204662.10237: no more pending results, returning what we have 41175 1727204662.10242: results queue empty 41175 1727204662.10243: checking for any_errors_fatal 41175 1727204662.10255: done 
checking for any_errors_fatal 41175 1727204662.10255: checking for max_fail_percentage 41175 1727204662.10257: done checking for max_fail_percentage 41175 1727204662.10258: checking to see if all hosts have failed and the running result is not ok 41175 1727204662.10260: done checking to see if all hosts have failed 41175 1727204662.10261: getting the remaining hosts for this loop 41175 1727204662.10263: done getting the remaining hosts for this loop 41175 1727204662.10267: getting the next task for host managed-node3 41175 1727204662.10273: done getting next task for host managed-node3 41175 1727204662.10276: ^ task is: TASK: Remove the dedicated test file in `/etc/iproute2/rt_tables.d/` 41175 1727204662.10279: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204662.10283: getting variables 41175 1727204662.10284: in VariableManager get_vars() 41175 1727204662.10332: Calling all_inventory to load vars for managed-node3 41175 1727204662.10335: Calling groups_inventory to load vars for managed-node3 41175 1727204662.10338: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204662.10351: Calling all_plugins_play to load vars for managed-node3 41175 1727204662.10354: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204662.10357: Calling groups_plugins_play to load vars for managed-node3 41175 1727204662.12386: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204662.14064: done with get_vars() 41175 1727204662.14092: done getting variables TASK [Remove the dedicated test file in `/etc/iproute2/rt_tables.d/`] ********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:135 Tuesday 24 September 2024 15:04:22 -0400 (0:00:00.066) 0:00:29.280 ***** 41175 1727204662.14172: entering _queue_task() for managed-node3/file 41175 1727204662.14434: worker is 1 (out of 1 available) 41175 1727204662.14451: exiting _queue_task() for managed-node3/file 41175 1727204662.14464: done queuing things up, now waiting for results queue to drain 41175 1727204662.14466: waiting for pending results... 
41175 1727204662.14662: running TaskExecutor() for managed-node3/TASK: Remove the dedicated test file in `/etc/iproute2/rt_tables.d/` 41175 1727204662.14738: in run() - task 12b410aa-8751-f070-39c4-0000000000ae 41175 1727204662.14752: variable 'ansible_search_path' from source: unknown 41175 1727204662.14783: calling self._execute() 41175 1727204662.14875: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204662.14882: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204662.14896: variable 'omit' from source: magic vars 41175 1727204662.15239: variable 'ansible_distribution_major_version' from source: facts 41175 1727204662.15254: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204662.15257: variable 'omit' from source: magic vars 41175 1727204662.15275: variable 'omit' from source: magic vars 41175 1727204662.15309: variable 'omit' from source: magic vars 41175 1727204662.15347: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41175 1727204662.15382: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41175 1727204662.15402: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41175 1727204662.15417: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204662.15431: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204662.15461: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41175 1727204662.15464: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204662.15467: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204662.15557: Set connection var 
ansible_shell_executable to /bin/sh 41175 1727204662.15561: Set connection var ansible_shell_type to sh 41175 1727204662.15567: Set connection var ansible_pipelining to False 41175 1727204662.15576: Set connection var ansible_timeout to 10 41175 1727204662.15586: Set connection var ansible_connection to ssh 41175 1727204662.15591: Set connection var ansible_module_compression to ZIP_DEFLATED 41175 1727204662.15612: variable 'ansible_shell_executable' from source: unknown 41175 1727204662.15615: variable 'ansible_connection' from source: unknown 41175 1727204662.15622: variable 'ansible_module_compression' from source: unknown 41175 1727204662.15625: variable 'ansible_shell_type' from source: unknown 41175 1727204662.15629: variable 'ansible_shell_executable' from source: unknown 41175 1727204662.15633: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204662.15638: variable 'ansible_pipelining' from source: unknown 41175 1727204662.15642: variable 'ansible_timeout' from source: unknown 41175 1727204662.15647: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204662.15827: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 41175 1727204662.15839: variable 'omit' from source: magic vars 41175 1727204662.15845: starting attempt loop 41175 1727204662.15848: running the handler 41175 1727204662.15862: _low_level_execute_command(): starting 41175 1727204662.15870: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41175 1727204662.16427: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204662.16431: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204662.16434: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204662.16436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204662.16487: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204662.16490: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204662.16539: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204662.18253: stdout chunk (state=3): >>>/root <<< 41175 1727204662.18363: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204662.18420: stderr chunk (state=3): >>><<< 41175 1727204662.18424: stdout chunk (state=3): >>><<< 41175 1727204662.18448: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204662.18460: _low_level_execute_command(): starting 41175 1727204662.18466: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204662.184473-42548-143094638843151 `" && echo ansible-tmp-1727204662.184473-42548-143094638843151="` echo /root/.ansible/tmp/ansible-tmp-1727204662.184473-42548-143094638843151 `" ) && sleep 0' 41175 1727204662.18937: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204662.18941: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 41175 1727204662.18943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 41175 1727204662.18953: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 41175 1727204662.18956: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204662.19014: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204662.19017: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204662.19048: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204662.21009: stdout chunk (state=3): >>>ansible-tmp-1727204662.184473-42548-143094638843151=/root/.ansible/tmp/ansible-tmp-1727204662.184473-42548-143094638843151 <<< 41175 1727204662.21126: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204662.21180: stderr chunk (state=3): >>><<< 41175 1727204662.21183: stdout chunk (state=3): >>><<< 41175 1727204662.21202: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204662.184473-42548-143094638843151=/root/.ansible/tmp/ansible-tmp-1727204662.184473-42548-143094638843151 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204662.21247: variable 'ansible_module_compression' from source: unknown 41175 1727204662.21299: ANSIBALLZ: Using lock for file 41175 1727204662.21303: ANSIBALLZ: Acquiring lock 41175 1727204662.21306: ANSIBALLZ: Lock acquired: 140088839297584 41175 1727204662.21311: ANSIBALLZ: Creating module 41175 1727204662.38337: ANSIBALLZ: Writing module into payload 41175 1727204662.38538: ANSIBALLZ: Writing module 41175 1727204662.38721: ANSIBALLZ: Renaming module 41175 1727204662.38724: ANSIBALLZ: Done creating module 41175 1727204662.38728: variable 'ansible_facts' from source: unknown 41175 1727204662.38730: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204662.184473-42548-143094638843151/AnsiballZ_file.py 41175 1727204662.39368: Sending initial data 41175 1727204662.39371: Sent initial data (152 bytes) 41175 1727204662.40969: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 1727204662.41003: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204662.41039: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204662.41058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204662.41181: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204662.41233: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204662.42970: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41175 1727204662.43011: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41175 1727204662.43060: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41175qa2aqh90/tmpb4y2fnxt /root/.ansible/tmp/ansible-tmp-1727204662.184473-42548-143094638843151/AnsiballZ_file.py <<< 41175 1727204662.43064: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204662.184473-42548-143094638843151/AnsiballZ_file.py" <<< 41175 1727204662.43395: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-41175qa2aqh90/tmpb4y2fnxt" to remote "/root/.ansible/tmp/ansible-tmp-1727204662.184473-42548-143094638843151/AnsiballZ_file.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204662.184473-42548-143094638843151/AnsiballZ_file.py" <<< 41175 1727204662.44702: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204662.44715: stdout chunk (state=3): >>><<< 41175 1727204662.44733: stderr chunk (state=3): >>><<< 41175 1727204662.44767: done transferring module to remote 41175 1727204662.44812: _low_level_execute_command(): starting 41175 1727204662.44981: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204662.184473-42548-143094638843151/ /root/.ansible/tmp/ansible-tmp-1727204662.184473-42548-143094638843151/AnsiballZ_file.py && sleep 0' 41175 1727204662.45787: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 1727204662.45929: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204662.45949: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204662.46037: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204662.46076: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204662.46092: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204662.46143: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204662.48043: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204662.48073: stderr chunk (state=3): >>><<< 41175 1727204662.48077: stdout chunk (state=3): >>><<< 41175 1727204662.48140: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204662.48144: _low_level_execute_command(): starting 41175 1727204662.48147: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204662.184473-42548-143094638843151/AnsiballZ_file.py && sleep 0' 41175 1727204662.48942: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 1727204662.48998: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 41175 1727204662.49013: stderr chunk (state=3): >>>debug2: match found <<< 41175 1727204662.49089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204662.49120: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204662.49144: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204662.49226: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204662.67241: stdout chunk (state=3): >>> {"path": "/etc/iproute2/rt_tables.d/table.conf", "changed": true, "diff": {"before": {"path": "/etc/iproute2/rt_tables.d/table.conf", "state": "file"}, "after": {"path": "/etc/iproute2/rt_tables.d/table.conf", "state": "absent"}}, "state": "absent", "invocation": {"module_args": {"state": "absent", "path": "/etc/iproute2/rt_tables.d/table.conf", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "mode": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 41175 1727204662.69122: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 41175 1727204662.69126: stdout chunk (state=3): >>><<< 41175 1727204662.69129: stderr chunk (state=3): >>><<< 41175 1727204662.69404: _low_level_execute_command() done: rc=0, stdout= {"path": "/etc/iproute2/rt_tables.d/table.conf", "changed": true, "diff": {"before": {"path": "/etc/iproute2/rt_tables.d/table.conf", "state": "file"}, "after": {"path": "/etc/iproute2/rt_tables.d/table.conf", "state": "absent"}}, "state": "absent", "invocation": {"module_args": {"state": "absent", "path": "/etc/iproute2/rt_tables.d/table.conf", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "mode": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
41175 1727204662.69409: done with _execute_module (file, {'state': 'absent', 'path': '/etc/iproute2/rt_tables.d/table.conf', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204662.184473-42548-143094638843151/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41175 1727204662.69413: _low_level_execute_command(): starting 41175 1727204662.69415: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204662.184473-42548-143094638843151/ > /dev/null 2>&1 && sleep 0' 41175 1727204662.70554: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 1727204662.70573: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204662.70605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204662.70712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204662.70754: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204662.70774: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204662.70833: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204662.70999: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204662.73029: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204662.73295: stdout chunk (state=3): >>><<< 41175 1727204662.73298: stderr chunk (state=3): >>><<< 41175 1727204662.73301: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 
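The commands traced above are the standard per-task remote lifecycle: create a private tmp dir with `umask 77`, upload the `AnsiballZ_file.py` payload over SFTP, `chmod u+x` the dir and payload, run the payload with the remote Python, then `rm -f -r` the tmp dir. A minimal local sketch of that shell pattern (paths simplified to a local tmp location; the real run targets `/root/.ansible/tmp` on the managed node over the multiplexed SSH connection):

```shell
# Sketch of the mkdir/upload/chmod/cleanup pattern from the log, run locally.
# Paths here are illustrative, not the ones from the trace.
base="${TMPDIR:-/tmp}/ansible-demo"
tmp="$base/ansible-tmp-$(date +%s)-$$"
( umask 77 && mkdir -p "$base" && mkdir "$tmp" ) && echo "created"
touch "$tmp/AnsiballZ_file.py"             # stands in for the SFTP upload
chmod u+x "$tmp" "$tmp/AnsiballZ_file.py"  # same chmod the log shows
# /usr/bin/python3 "$tmp/AnsiballZ_file.py" would run the module here
rm -f -r "$tmp"                            # cleanup, as in the final command
```

The `umask 77` matters: it ensures the tmp dir and payload are readable only by the connecting user before any module code or data lands in them.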
41175 1727204662.73304: handler run complete 41175 1727204662.73306: attempt loop complete, returning result 41175 1727204662.73308: _execute() done 41175 1727204662.73310: dumping result to json 41175 1727204662.73311: done dumping result, returning 41175 1727204662.73313: done running TaskExecutor() for managed-node3/TASK: Remove the dedicated test file in `/etc/iproute2/rt_tables.d/` [12b410aa-8751-f070-39c4-0000000000ae] 41175 1727204662.73315: sending task result for task 12b410aa-8751-f070-39c4-0000000000ae 41175 1727204662.73400: done sending task result for task 12b410aa-8751-f070-39c4-0000000000ae 41175 1727204662.73403: WORKER PROCESS EXITING changed: [managed-node3] => { "changed": true, "path": "/etc/iproute2/rt_tables.d/table.conf", "state": "absent" } 41175 1727204662.73595: no more pending results, returning what we have 41175 1727204662.73601: results queue empty 41175 1727204662.73602: checking for any_errors_fatal 41175 1727204662.73612: done checking for any_errors_fatal 41175 1727204662.73613: checking for max_fail_percentage 41175 1727204662.73615: done checking for max_fail_percentage 41175 1727204662.73616: checking to see if all hosts have failed and the running result is not ok 41175 1727204662.73620: done checking to see if all hosts have failed 41175 1727204662.73621: getting the remaining hosts for this loop 41175 1727204662.73623: done getting the remaining hosts for this loop 41175 1727204662.73628: getting the next task for host managed-node3 41175 1727204662.73636: done getting next task for host managed-node3 41175 1727204662.73639: ^ task is: TASK: meta (flush_handlers) 41175 1727204662.73641: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
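The `changed: [managed-node3] => {...}` block above is Ansible's rendering of the JSON object the module printed on stdout (the `{"path": ..., "changed": true, ...}` chunk earlier in the trace). A small sketch of reading that result shape yourself, assuming only that a module emits a single JSON object on stdout (fields copied from the log):

```shell
# Parse a module-result JSON like the one captured in this trace.
result='{"path": "/etc/iproute2/rt_tables.d/table.conf", "changed": true, "state": "absent"}'
echo "$result" | python3 -c 'import json,sys; r=json.load(sys.stdin); print(r["changed"], r["state"])'
# prints: True absent
```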
False 41175 1727204662.73646: getting variables 41175 1727204662.73648: in VariableManager get_vars() 41175 1727204662.73836: Calling all_inventory to load vars for managed-node3 41175 1727204662.73839: Calling groups_inventory to load vars for managed-node3 41175 1727204662.73842: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204662.73856: Calling all_plugins_play to load vars for managed-node3 41175 1727204662.73859: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204662.73862: Calling groups_plugins_play to load vars for managed-node3 41175 1727204662.76752: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204662.80108: done with get_vars() 41175 1727204662.80153: done getting variables 41175 1727204662.80240: in VariableManager get_vars() 41175 1727204662.80260: Calling all_inventory to load vars for managed-node3 41175 1727204662.80263: Calling groups_inventory to load vars for managed-node3 41175 1727204662.80266: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204662.80273: Calling all_plugins_play to load vars for managed-node3 41175 1727204662.80276: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204662.80280: Calling groups_plugins_play to load vars for managed-node3 41175 1727204662.82275: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204662.87382: done with get_vars() 41175 1727204662.87428: done queuing things up, now waiting for results queue to drain 41175 1727204662.87431: results queue empty 41175 1727204662.87432: checking for any_errors_fatal 41175 1727204662.87437: done checking for any_errors_fatal 41175 1727204662.87439: checking for max_fail_percentage 41175 1727204662.87440: done checking for max_fail_percentage 41175 1727204662.87441: checking to see if all hosts have failed and the running result is not 
ok 41175 1727204662.87442: done checking to see if all hosts have failed 41175 1727204662.87443: getting the remaining hosts for this loop 41175 1727204662.87444: done getting the remaining hosts for this loop 41175 1727204662.87448: getting the next task for host managed-node3 41175 1727204662.87453: done getting next task for host managed-node3 41175 1727204662.87455: ^ task is: TASK: meta (flush_handlers) 41175 1727204662.87457: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41175 1727204662.87462: getting variables 41175 1727204662.87463: in VariableManager get_vars() 41175 1727204662.87481: Calling all_inventory to load vars for managed-node3 41175 1727204662.87484: Calling groups_inventory to load vars for managed-node3 41175 1727204662.87487: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204662.87519: Calling all_plugins_play to load vars for managed-node3 41175 1727204662.87524: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204662.87528: Calling groups_plugins_play to load vars for managed-node3 41175 1727204662.89458: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204662.92300: done with get_vars() 41175 1727204662.92337: done getting variables 41175 1727204662.92404: in VariableManager get_vars() 41175 1727204662.92424: Calling all_inventory to load vars for managed-node3 41175 1727204662.92427: Calling groups_inventory to load vars for managed-node3 41175 1727204662.92430: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204662.92436: Calling all_plugins_play to load vars for managed-node3 41175 1727204662.92439: Calling groups_plugins_inventory to load vars for 
managed-node3 41175 1727204662.92443: Calling groups_plugins_play to load vars for managed-node3 41175 1727204662.93664: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204662.96093: done with get_vars() 41175 1727204662.96134: done queuing things up, now waiting for results queue to drain 41175 1727204662.96141: results queue empty 41175 1727204662.96142: checking for any_errors_fatal 41175 1727204662.96144: done checking for any_errors_fatal 41175 1727204662.96145: checking for max_fail_percentage 41175 1727204662.96146: done checking for max_fail_percentage 41175 1727204662.96147: checking to see if all hosts have failed and the running result is not ok 41175 1727204662.96148: done checking to see if all hosts have failed 41175 1727204662.96149: getting the remaining hosts for this loop 41175 1727204662.96150: done getting the remaining hosts for this loop 41175 1727204662.96153: getting the next task for host managed-node3 41175 1727204662.96157: done getting next task for host managed-node3 41175 1727204662.96158: ^ task is: None 41175 1727204662.96160: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204662.96161: done queuing things up, now waiting for results queue to drain 41175 1727204662.96162: results queue empty 41175 1727204662.96163: checking for any_errors_fatal 41175 1727204662.96163: done checking for any_errors_fatal 41175 1727204662.96164: checking for max_fail_percentage 41175 1727204662.96165: done checking for max_fail_percentage 41175 1727204662.96166: checking to see if all hosts have failed and the running result is not ok 41175 1727204662.96167: done checking to see if all hosts have failed 41175 1727204662.96168: getting the next task for host managed-node3 41175 1727204662.96171: done getting next task for host managed-node3 41175 1727204662.96172: ^ task is: None 41175 1727204662.96173: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204662.96243: in VariableManager get_vars() 41175 1727204662.96271: done with get_vars() 41175 1727204662.96278: in VariableManager get_vars() 41175 1727204662.96297: done with get_vars() 41175 1727204662.96303: variable 'omit' from source: magic vars 41175 1727204662.96451: variable 'profile' from source: play vars 41175 1727204662.96570: in VariableManager get_vars() 41175 1727204662.96595: done with get_vars() 41175 1727204662.96621: variable 'omit' from source: magic vars 41175 1727204662.96710: variable 'profile' from source: play vars PLAY [Set down {{ profile }}] ************************************************** 41175 1727204662.97572: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 41175 1727204662.97598: getting the remaining hosts for this loop 41175 1727204662.97600: done getting the remaining hosts for this loop 41175 1727204662.97603: getting the next task for host managed-node3 41175 1727204662.97606: done getting next task for host managed-node3 41175 1727204662.97608: ^ task is: TASK: Gathering Facts 41175 1727204662.97610: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204662.97612: getting variables 41175 1727204662.97613: in VariableManager get_vars() 41175 1727204662.97712: Calling all_inventory to load vars for managed-node3 41175 1727204662.97715: Calling groups_inventory to load vars for managed-node3 41175 1727204662.97718: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204662.97725: Calling all_plugins_play to load vars for managed-node3 41175 1727204662.97728: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204662.97732: Calling groups_plugins_play to load vars for managed-node3 41175 1727204662.98973: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204663.01098: done with get_vars() 41175 1727204663.01124: done getting variables 41175 1727204663.01164: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3 Tuesday 24 September 2024 15:04:23 -0400 (0:00:00.870) 0:00:30.150 ***** 41175 1727204663.01193: entering _queue_task() for managed-node3/gather_facts 41175 1727204663.01477: worker is 1 (out of 1 available) 41175 1727204663.01492: exiting _queue_task() for managed-node3/gather_facts 41175 1727204663.01505: done queuing things up, now waiting for results queue to drain 41175 1727204663.01507: waiting for pending results... 
41175 1727204663.01715: running TaskExecutor() for managed-node3/TASK: Gathering Facts 41175 1727204663.01798: in run() - task 12b410aa-8751-f070-39c4-0000000006a2 41175 1727204663.01813: variable 'ansible_search_path' from source: unknown 41175 1727204663.01851: calling self._execute() 41175 1727204663.01947: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204663.01953: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204663.01965: variable 'omit' from source: magic vars 41175 1727204663.02304: variable 'ansible_distribution_major_version' from source: facts 41175 1727204663.02316: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204663.02326: variable 'omit' from source: magic vars 41175 1727204663.02351: variable 'omit' from source: magic vars 41175 1727204663.02405: variable 'omit' from source: magic vars 41175 1727204663.02428: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41175 1727204663.02459: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41175 1727204663.02479: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41175 1727204663.02497: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204663.02510: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204663.02543: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41175 1727204663.02547: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204663.02549: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204663.02641: Set connection var ansible_shell_executable to /bin/sh 41175 1727204663.02645: Set 
connection var ansible_shell_type to sh 41175 1727204663.02651: Set connection var ansible_pipelining to False 41175 1727204663.02660: Set connection var ansible_timeout to 10 41175 1727204663.02666: Set connection var ansible_connection to ssh 41175 1727204663.02673: Set connection var ansible_module_compression to ZIP_DEFLATED 41175 1727204663.02703: variable 'ansible_shell_executable' from source: unknown 41175 1727204663.02707: variable 'ansible_connection' from source: unknown 41175 1727204663.02710: variable 'ansible_module_compression' from source: unknown 41175 1727204663.02713: variable 'ansible_shell_type' from source: unknown 41175 1727204663.02717: variable 'ansible_shell_executable' from source: unknown 41175 1727204663.02726: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204663.02730: variable 'ansible_pipelining' from source: unknown 41175 1727204663.02734: variable 'ansible_timeout' from source: unknown 41175 1727204663.02741: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204663.02908: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41175 1727204663.02928: variable 'omit' from source: magic vars 41175 1727204663.02932: starting attempt loop 41175 1727204663.02935: running the handler 41175 1727204663.02949: variable 'ansible_facts' from source: unknown 41175 1727204663.02970: _low_level_execute_command(): starting 41175 1727204663.02977: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41175 1727204663.03649: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 41175 1727204663.03655: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204663.03658: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204663.03749: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204663.03859: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204663.03866: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204663.05635: stdout chunk (state=3): >>>/root <<< 41175 1727204663.05710: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204663.05824: stderr chunk (state=3): >>><<< 41175 1727204663.05831: stdout chunk (state=3): >>><<< 41175 1727204663.06015: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204663.06024: _low_level_execute_command(): starting 41175 1727204663.06027: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204663.0587711-42683-142606819943692 `" && echo ansible-tmp-1727204663.0587711-42683-142606819943692="` echo /root/.ansible/tmp/ansible-tmp-1727204663.0587711-42683-142606819943692 `" ) && sleep 0' 41175 1727204663.07080: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 1727204663.07109: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204663.07142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204663.07169: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204663.07219: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 41175 1727204663.07246: stderr chunk (state=3): >>>debug2: match not found <<< 41175 1727204663.07315: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204663.07454: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204663.07457: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204663.07544: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204663.07549: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204663.09546: stdout chunk (state=3): >>>ansible-tmp-1727204663.0587711-42683-142606819943692=/root/.ansible/tmp/ansible-tmp-1727204663.0587711-42683-142606819943692 <<< 41175 1727204663.09752: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204663.09756: stdout chunk (state=3): >>><<< 41175 1727204663.09758: stderr chunk (state=3): >>><<< 41175 1727204663.09870: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204663.0587711-42683-142606819943692=/root/.ansible/tmp/ansible-tmp-1727204663.0587711-42683-142606819943692 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204663.09874: variable 'ansible_module_compression' from source: unknown 41175 1727204663.09877: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-41175qa2aqh90/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 41175 1727204663.10008: variable 'ansible_facts' from source: unknown 41175 1727204663.10151: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204663.0587711-42683-142606819943692/AnsiballZ_setup.py 41175 1727204663.10354: Sending initial data 41175 1727204663.10365: Sent initial data (154 bytes) 41175 1727204663.11111: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 1727204663.11227: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204663.11243: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204663.11267: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204663.11341: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204663.12963: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41175 1727204663.13033: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41175 1727204663.13053: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41175qa2aqh90/tmp26kq4bkp /root/.ansible/tmp/ansible-tmp-1727204663.0587711-42683-142606819943692/AnsiballZ_setup.py <<< 41175 1727204663.13062: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204663.0587711-42683-142606819943692/AnsiballZ_setup.py" <<< 41175 1727204663.13103: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-41175qa2aqh90/tmp26kq4bkp" to remote "/root/.ansible/tmp/ansible-tmp-1727204663.0587711-42683-142606819943692/AnsiballZ_setup.py" <<< 41175 1727204663.13107: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204663.0587711-42683-142606819943692/AnsiballZ_setup.py" <<< 41175 1727204663.15225: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204663.15383: stderr chunk (state=3): >>><<< 41175 1727204663.15386: stdout chunk (state=3): >>><<< 41175 1727204663.15391: done transferring module to remote 41175 1727204663.15393: _low_level_execute_command(): starting 41175 1727204663.15396: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204663.0587711-42683-142606819943692/ /root/.ansible/tmp/ansible-tmp-1727204663.0587711-42683-142606819943692/AnsiballZ_setup.py && sleep 0' 41175 1727204663.15760: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204663.15774: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 
1727204663.15787: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204663.15844: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204663.15881: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204663.15898: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204663.17743: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204663.17790: stderr chunk (state=3): >>><<< 41175 1727204663.17795: stdout chunk (state=3): >>><<< 41175 1727204663.17810: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204663.17819: _low_level_execute_command(): starting 41175 1727204663.17822: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204663.0587711-42683-142606819943692/AnsiballZ_setup.py && sleep 0' 41175 1727204663.18251: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204663.18287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204663.18292: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204663.18294: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address <<< 41175 1727204663.18297: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204663.18299: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204663.18352: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204663.18357: stderr chunk 
(state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204663.18402: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204663.89665: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec272ed147e29e35f2e68cd6465c5ec1", "ansible_fips": false, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_iscsi_iqn": "", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_local": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, 
"crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fibre_channel_wwn": [], "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAI5YZQ7OH6eqgmanrwxkUl16pMvE2q26X32NofYRKBzF04m84VIsiCBP80rN+sGEKnRhTwlxJwcSfAyscmxkynk8ozeR0SaMEECkbOjee1DqGR1yz8VSKEIk2gZ+ImYscF6c32jGvz1w/gz9baswEs+v92Ljqv3+V3s8foVkwWM1AAAAFQDApo03iAyJzp9y7AillVl9LpN8rwAAAIBNHNvfLLH/rvWMdavYWGiljarx5Z8cDKFv4QiliuY2AenrQ5mjBN3ZJZuDpmwC9vuoPM+TWxp9pbrnVJy4VM6iS8c/Lr9I982fUD4neMvJEywdnYtsRhezGMCk57/Npw91h6EKhcAYiaFF53jl540WIjTvu2bEA8Hgb11YGH+isAAAAIAkremps+61DEFeDWQjRHbf8fZzhmpUYduU+sHRW5usa/1cOOeeN/8XBHfMST6TPedAY/6t7Oxda9D2mq6mo2Rl9arSQWcBypqwvzRiz0LGnRnElGtXKJALy6vYKG7xi+29ZmqlBvD14cB7/wSZqZP9MkRj3+QzQJLvNnuGRyLguA==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDBj8PEtqglWtlJ3r3hgP2TELjSd8JOOpjIitLlWjKdUao5ePB6PWTf9MZV0rLZr0re7hAS1EWeexARYQakyETmyOoPmRCaD5vvrfN3AJJ6I+O2EhApLpYrEORJbTfrme6AoCGmxQG8tR7j3YpVOvePZ65ka7FDUWnNLI0DWpyDURAKmvOxtiOcYazpmB7GJ/5ycpEAV7KGp7tEQ9MNIAbSaYTBXVBNa5V2HyEmcabs+/Qy/jp8OWy+Tl3uCUV0SmFplVCKib9Kp3eEMZd5udXsYnmUYtLNMJQkQOzTdol5AozustkdBnasVn/RSnQpWQMBrrUQMxchNOb8FDAuH6AONEVJl9mHY6mk3zfkkyPZE6sIrMIj0B48xTWzMIjC+N9SN7DRRUWzjYIqkL5fsYu0fkkGuZeNvyJRlv8h7oFWA7YtvNHdNYf41mkXryERg8V3zI0aZcmQul6XTOxywwd4b5sudMIng09hfyPOKtnYi6DIN2h5FxOWlvBEbLlcd2U=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPUvqdp1GSRMDwSqfOZO1hLGpDfzy51B9cIhTK2AWy7qlUXPaSlJ0jc31uj+CW3SnUW36VSKRHdj9R9hJev9Zic=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": 
"AAAAC3NzaC1lZDI1NTE5AAAAIFL7RdA+aCgUcBhcJBLwti3mnwduhYXxSw8RlI3Cvebm", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 50414 10.31.10.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 50414 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_loadavg": {"1m": 1.1796875, "5m": 0.9033203125, "15m": 0.5458984375}, "ansible_is_chroot": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2843, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 874, "free": 2843}, "nocache": {"free": 3473, "used": 244}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", 
"ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec272ed1-47e2-9e35-f2e6-8cd6465c5ec1", "ansible_product_uuid": "ec272ed1-47e2-9e35-f2e6-8cd6465c5ec1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 1167, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251148648448, "block_size": 4096, "block_total": 64479564, "block_available": 
61315588, "block_used": 3163976, "inode_total": 16384000, "inode_available": 16302069, "inode_used": 81931, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_hostnqn": "", "ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_interfaces": ["peerethtest0", "eth0", "ethtest0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:5e:c8:16:36:1d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.90", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::37d3:4e93:30d:de94", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on 
[fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on 
[fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_ethtest0": {"device": "ethtest0", 
"macaddress": "86:6c:78:87:31:5f", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv4": {"address": "198.51.100.3", "broadcast": "198.51.100.63", "netmask": "255.255.255.192", "network": "198.51.100.0", "prefix": "26"}, "ipv6": [{"address": "fe80::3cf7:5250:7aab:a9e0", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", 
"esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerethtest0": {"device": "peerethtest0", "macaddress": "b2:8f:09:74:fb:c0", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::b08f:9ff:fe74:fbc0", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", 
"tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.90", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:5e:c8:16:36:1d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.90", "198.51.100.3"], "ansible_all_ipv6_addresses": ["fe80::37d3:4e93:30d:de94", "fe80::3cf7:5250:7aab:a9e0", "fe80::b08f:9ff:fe74:fbc0"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.90", "127.0.0.0/8", "127.0.0.1", "198.51.100.3"], "ipv6": ["::1", "fe80::37d3:4e93:30d:de94", "fe80::3cf7:5250:7aab:a9e0", "fe80::b08f:9ff:fe74:fbc0"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "04", "second": "23", "epoch": "1727204663", "epoch_int": "1727204663", "date": "2024-09-24", "time": "15:04:23", "iso8601_micro": "2024-09-24T19:04:23.891188Z", "iso8601": "2024-09-24T19:04:23Z", "iso8601_basic": 
"20240924T150423891188", "iso8601_basic_short": "20240924T150423", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_apparmor": {"status": "disabled"}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 41175 1727204663.91817: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 41175 1727204663.91839: stderr chunk (state=3): >>><<< 41175 1727204663.91851: stdout chunk (state=3): >>><<< 41175 1727204663.91907: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec272ed147e29e35f2e68cd6465c5ec1", "ansible_fips": false, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": 
"/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_iscsi_iqn": "", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_local": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fibre_channel_wwn": [], "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAI5YZQ7OH6eqgmanrwxkUl16pMvE2q26X32NofYRKBzF04m84VIsiCBP80rN+sGEKnRhTwlxJwcSfAyscmxkynk8ozeR0SaMEECkbOjee1DqGR1yz8VSKEIk2gZ+ImYscF6c32jGvz1w/gz9baswEs+v92Ljqv3+V3s8foVkwWM1AAAAFQDApo03iAyJzp9y7AillVl9LpN8rwAAAIBNHNvfLLH/rvWMdavYWGiljarx5Z8cDKFv4QiliuY2AenrQ5mjBN3ZJZuDpmwC9vuoPM+TWxp9pbrnVJy4VM6iS8c/Lr9I982fUD4neMvJEywdnYtsRhezGMCk57/Npw91h6EKhcAYiaFF53jl540WIjTvu2bEA8Hgb11YGH+isAAAAIAkremps+61DEFeDWQjRHbf8fZzhmpUYduU+sHRW5usa/1cOOeeN/8XBHfMST6TPedAY/6t7Oxda9D2mq6mo2Rl9arSQWcBypqwvzRiz0LGnRnElGtXKJALy6vYKG7xi+29ZmqlBvD14cB7/wSZqZP9MkRj3+QzQJLvNnuGRyLguA==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQDBj8PEtqglWtlJ3r3hgP2TELjSd8JOOpjIitLlWjKdUao5ePB6PWTf9MZV0rLZr0re7hAS1EWeexARYQakyETmyOoPmRCaD5vvrfN3AJJ6I+O2EhApLpYrEORJbTfrme6AoCGmxQG8tR7j3YpVOvePZ65ka7FDUWnNLI0DWpyDURAKmvOxtiOcYazpmB7GJ/5ycpEAV7KGp7tEQ9MNIAbSaYTBXVBNa5V2HyEmcabs+/Qy/jp8OWy+Tl3uCUV0SmFplVCKib9Kp3eEMZd5udXsYnmUYtLNMJQkQOzTdol5AozustkdBnasVn/RSnQpWQMBrrUQMxchNOb8FDAuH6AONEVJl9mHY6mk3zfkkyPZE6sIrMIj0B48xTWzMIjC+N9SN7DRRUWzjYIqkL5fsYu0fkkGuZeNvyJRlv8h7oFWA7YtvNHdNYf41mkXryERg8V3zI0aZcmQul6XTOxywwd4b5sudMIng09hfyPOKtnYi6DIN2h5FxOWlvBEbLlcd2U=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPUvqdp1GSRMDwSqfOZO1hLGpDfzy51B9cIhTK2AWy7qlUXPaSlJ0jc31uj+CW3SnUW36VSKRHdj9R9hJev9Zic=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFL7RdA+aCgUcBhcJBLwti3mnwduhYXxSw8RlI3Cvebm", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 50414 10.31.10.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 50414 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_loadavg": {"1m": 1.1796875, "5m": 0.9033203125, "15m": 0.5458984375}, "ansible_is_chroot": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU 
E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2843, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 874, "free": 2843}, "nocache": {"free": 3473, "used": 244}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec272ed1-47e2-9e35-f2e6-8cd6465c5ec1", "ansible_product_uuid": "ec272ed1-47e2-9e35-f2e6-8cd6465c5ec1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 
GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 1167, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251148648448, "block_size": 4096, "block_total": 64479564, "block_available": 61315588, "block_used": 3163976, "inode_total": 16384000, "inode_available": 16302069, "inode_used": 81931, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_hostnqn": "", "ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_interfaces": ["peerethtest0", "eth0", "ethtest0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:5e:c8:16:36:1d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.90", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::37d3:4e93:30d:de94", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", 
"tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", 
"macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off 
[fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_ethtest0": {"device": "ethtest0", "macaddress": "86:6c:78:87:31:5f", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv4": {"address": "198.51.100.3", "broadcast": "198.51.100.63", "netmask": "255.255.255.192", "network": "198.51.100.0", "prefix": "26"}, "ipv6": [{"address": "fe80::3cf7:5250:7aab:a9e0", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": 
"off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerethtest0": {"device": "peerethtest0", "macaddress": "b2:8f:09:74:fb:c0", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::b08f:9ff:fe74:fbc0", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": 
"on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.90", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": 
"12:5e:c8:16:36:1d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.90", "198.51.100.3"], "ansible_all_ipv6_addresses": ["fe80::37d3:4e93:30d:de94", "fe80::3cf7:5250:7aab:a9e0", "fe80::b08f:9ff:fe74:fbc0"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.90", "127.0.0.0/8", "127.0.0.1", "198.51.100.3"], "ipv6": ["::1", "fe80::37d3:4e93:30d:de94", "fe80::3cf7:5250:7aab:a9e0", "fe80::b08f:9ff:fe74:fbc0"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "04", "second": "23", "epoch": "1727204663", "epoch_int": "1727204663", "date": "2024-09-24", "time": "15:04:23", "iso8601_micro": "2024-09-24T19:04:23.891188Z", "iso8601": "2024-09-24T19:04:23Z", "iso8601_basic": "20240924T150423891188", "iso8601_basic_short": "20240924T150423", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_apparmor": {"status": "disabled"}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 41175 1727204663.92677: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204663.0587711-42683-142606819943692/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41175 1727204663.92715: _low_level_execute_command(): starting 41175 1727204663.92730: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204663.0587711-42683-142606819943692/ > /dev/null 2>&1 && sleep 0' 41175 1727204663.93495: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204663.93526: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204663.93545: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204663.93568: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204663.93637: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204663.95621: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204663.95706: stderr chunk (state=3): >>><<< 41175 1727204663.95720: stdout chunk (state=3): >>><<< 41175 1727204663.95906: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204663.95910: handler run complete 41175 1727204663.96072: variable 'ansible_facts' from source: unknown 41175 1727204663.96267: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204663.96985: variable 'ansible_facts' from source: unknown 41175 1727204663.97202: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204663.97591: attempt loop complete, returning result 41175 1727204663.97630: _execute() done 41175 1727204663.97724: dumping result to json 41175 1727204663.98008: done dumping result, returning 41175 1727204663.98074: done running TaskExecutor() for managed-node3/TASK: Gathering Facts [12b410aa-8751-f070-39c4-0000000006a2] 41175 1727204663.98077: sending task result for task 12b410aa-8751-f070-39c4-0000000006a2 ok: [managed-node3] 41175 1727204663.99317: no more pending results, returning what we have 41175 1727204663.99322: results queue empty 41175 1727204663.99323: checking for any_errors_fatal 41175 1727204663.99324: done checking for any_errors_fatal 41175 1727204663.99325: checking for max_fail_percentage 41175 1727204663.99327: done checking for max_fail_percentage 41175 1727204663.99328: checking to see if all hosts have failed and the running result is not ok 41175 1727204663.99329: done checking to see if all hosts have failed 41175 1727204663.99330: getting the remaining hosts for this loop 41175 1727204663.99332: done getting the remaining hosts for this loop 41175 1727204663.99336: getting the next task for host managed-node3 41175 1727204663.99342: done getting next task for host managed-node3 41175 1727204663.99344: ^ task is: TASK: 
meta (flush_handlers) 41175 1727204663.99346: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41175 1727204663.99351: getting variables 41175 1727204663.99352: in VariableManager get_vars() 41175 1727204663.99383: Calling all_inventory to load vars for managed-node3 41175 1727204663.99386: Calling groups_inventory to load vars for managed-node3 41175 1727204663.99392: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204663.99399: done sending task result for task 12b410aa-8751-f070-39c4-0000000006a2 41175 1727204663.99402: WORKER PROCESS EXITING 41175 1727204663.99414: Calling all_plugins_play to load vars for managed-node3 41175 1727204663.99418: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204663.99422: Calling groups_plugins_play to load vars for managed-node3 41175 1727204664.01578: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204664.04746: done with get_vars() 41175 1727204664.04785: done getting variables 41175 1727204664.04868: in VariableManager get_vars() 41175 1727204664.04886: Calling all_inventory to load vars for managed-node3 41175 1727204664.04892: Calling groups_inventory to load vars for managed-node3 41175 1727204664.04895: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204664.04901: Calling all_plugins_play to load vars for managed-node3 41175 1727204664.04904: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204664.04908: Calling groups_plugins_play to load vars for managed-node3 41175 1727204664.06916: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 
1727204664.09953: done with get_vars() 41175 1727204664.09998: done queuing things up, now waiting for results queue to drain 41175 1727204664.10001: results queue empty 41175 1727204664.10002: checking for any_errors_fatal 41175 1727204664.10008: done checking for any_errors_fatal 41175 1727204664.10009: checking for max_fail_percentage 41175 1727204664.10010: done checking for max_fail_percentage 41175 1727204664.10011: checking to see if all hosts have failed and the running result is not ok 41175 1727204664.10012: done checking to see if all hosts have failed 41175 1727204664.10013: getting the remaining hosts for this loop 41175 1727204664.10019: done getting the remaining hosts for this loop 41175 1727204664.10023: getting the next task for host managed-node3 41175 1727204664.10028: done getting next task for host managed-node3 41175 1727204664.10031: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 41175 1727204664.10033: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204664.10046: getting variables 41175 1727204664.10047: in VariableManager get_vars() 41175 1727204664.10064: Calling all_inventory to load vars for managed-node3 41175 1727204664.10067: Calling groups_inventory to load vars for managed-node3 41175 1727204664.10070: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204664.10076: Calling all_plugins_play to load vars for managed-node3 41175 1727204664.10079: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204664.10082: Calling groups_plugins_play to load vars for managed-node3 41175 1727204664.12156: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204664.15170: done with get_vars() 41175 1727204664.15205: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 15:04:24 -0400 (0:00:01.141) 0:00:31.291 ***** 41175 1727204664.15300: entering _queue_task() for managed-node3/include_tasks 41175 1727204664.15671: worker is 1 (out of 1 available) 41175 1727204664.15685: exiting _queue_task() for managed-node3/include_tasks 41175 1727204664.15699: done queuing things up, now waiting for results queue to drain 41175 1727204664.15701: waiting for pending results... 
41175 1727204664.16111: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 41175 1727204664.16136: in run() - task 12b410aa-8751-f070-39c4-0000000000b7 41175 1727204664.16160: variable 'ansible_search_path' from source: unknown 41175 1727204664.16168: variable 'ansible_search_path' from source: unknown 41175 1727204664.16217: calling self._execute() 41175 1727204664.16333: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204664.16348: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204664.16366: variable 'omit' from source: magic vars 41175 1727204664.16818: variable 'ansible_distribution_major_version' from source: facts 41175 1727204664.16838: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204664.16851: _execute() done 41175 1727204664.16865: dumping result to json 41175 1727204664.16874: done dumping result, returning 41175 1727204664.16887: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12b410aa-8751-f070-39c4-0000000000b7] 41175 1727204664.16903: sending task result for task 12b410aa-8751-f070-39c4-0000000000b7 41175 1727204664.17184: no more pending results, returning what we have 41175 1727204664.17193: in VariableManager get_vars() 41175 1727204664.17243: Calling all_inventory to load vars for managed-node3 41175 1727204664.17247: Calling groups_inventory to load vars for managed-node3 41175 1727204664.17250: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204664.17267: Calling all_plugins_play to load vars for managed-node3 41175 1727204664.17271: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204664.17275: Calling groups_plugins_play to load vars for managed-node3 41175 1727204664.17806: done sending task result for task 12b410aa-8751-f070-39c4-0000000000b7 41175 
1727204664.17809: WORKER PROCESS EXITING 41175 1727204664.19756: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204664.27636: done with get_vars() 41175 1727204664.27658: variable 'ansible_search_path' from source: unknown 41175 1727204664.27659: variable 'ansible_search_path' from source: unknown 41175 1727204664.27683: we have included files to process 41175 1727204664.27683: generating all_blocks data 41175 1727204664.27684: done generating all_blocks data 41175 1727204664.27685: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 41175 1727204664.27686: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 41175 1727204664.27687: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 41175 1727204664.28139: done processing included file 41175 1727204664.28140: iterating over new_blocks loaded from include file 41175 1727204664.28142: in VariableManager get_vars() 41175 1727204664.28157: done with get_vars() 41175 1727204664.28159: filtering new block on tags 41175 1727204664.28171: done filtering new block on tags 41175 1727204664.28173: in VariableManager get_vars() 41175 1727204664.28188: done with get_vars() 41175 1727204664.28191: filtering new block on tags 41175 1727204664.28207: done filtering new block on tags 41175 1727204664.28209: in VariableManager get_vars() 41175 1727204664.28224: done with get_vars() 41175 1727204664.28225: filtering new block on tags 41175 1727204664.28237: done filtering new block on tags 41175 1727204664.28239: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node3 41175 1727204664.28242: extending task lists for all hosts 
with included blocks 41175 1727204664.28572: done extending task lists 41175 1727204664.28574: done processing included files 41175 1727204664.28575: results queue empty 41175 1727204664.28575: checking for any_errors_fatal 41175 1727204664.28577: done checking for any_errors_fatal 41175 1727204664.28578: checking for max_fail_percentage 41175 1727204664.28579: done checking for max_fail_percentage 41175 1727204664.28580: checking to see if all hosts have failed and the running result is not ok 41175 1727204664.28581: done checking to see if all hosts have failed 41175 1727204664.28582: getting the remaining hosts for this loop 41175 1727204664.28584: done getting the remaining hosts for this loop 41175 1727204664.28586: getting the next task for host managed-node3 41175 1727204664.28592: done getting next task for host managed-node3 41175 1727204664.28595: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 41175 1727204664.28598: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204664.28607: getting variables 41175 1727204664.28608: in VariableManager get_vars() 41175 1727204664.28622: Calling all_inventory to load vars for managed-node3 41175 1727204664.28625: Calling groups_inventory to load vars for managed-node3 41175 1727204664.28628: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204664.28633: Calling all_plugins_play to load vars for managed-node3 41175 1727204664.28636: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204664.28639: Calling groups_plugins_play to load vars for managed-node3 41175 1727204664.30220: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204664.31820: done with get_vars() 41175 1727204664.31855: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 15:04:24 -0400 (0:00:00.166) 0:00:31.458 ***** 41175 1727204664.31919: entering _queue_task() for managed-node3/setup 41175 1727204664.32303: worker is 1 (out of 1 available) 41175 1727204664.32317: exiting _queue_task() for managed-node3/setup 41175 1727204664.32330: done queuing things up, now waiting for results queue to drain 41175 1727204664.32331: waiting for pending results... 
41175 1727204664.32647: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 41175 1727204664.32821: in run() - task 12b410aa-8751-f070-39c4-0000000006e3 41175 1727204664.32848: variable 'ansible_search_path' from source: unknown 41175 1727204664.32858: variable 'ansible_search_path' from source: unknown 41175 1727204664.32911: calling self._execute() 41175 1727204664.33037: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204664.33052: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204664.33070: variable 'omit' from source: magic vars 41175 1727204664.33543: variable 'ansible_distribution_major_version' from source: facts 41175 1727204664.33566: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204664.33801: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41175 1727204664.35895: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41175 1727204664.35901: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41175 1727204664.35904: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41175 1727204664.35925: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41175 1727204664.35961: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41175 1727204664.36057: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204664.36101: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204664.36139: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204664.36202: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204664.36225: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204664.36299: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204664.36333: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204664.36371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204664.36432: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204664.36456: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204664.36645: variable '__network_required_facts' from source: role 
'' defaults 41175 1727204664.36666: variable 'ansible_facts' from source: unknown 41175 1727204664.37745: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 41175 1727204664.37754: when evaluation is False, skipping this task 41175 1727204664.37882: _execute() done 41175 1727204664.37892: dumping result to json 41175 1727204664.37895: done dumping result, returning 41175 1727204664.37897: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12b410aa-8751-f070-39c4-0000000006e3] 41175 1727204664.37900: sending task result for task 12b410aa-8751-f070-39c4-0000000006e3 41175 1727204664.37975: done sending task result for task 12b410aa-8751-f070-39c4-0000000006e3 41175 1727204664.37978: WORKER PROCESS EXITING skipping: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 41175 1727204664.38044: no more pending results, returning what we have 41175 1727204664.38048: results queue empty 41175 1727204664.38049: checking for any_errors_fatal 41175 1727204664.38051: done checking for any_errors_fatal 41175 1727204664.38052: checking for max_fail_percentage 41175 1727204664.38053: done checking for max_fail_percentage 41175 1727204664.38055: checking to see if all hosts have failed and the running result is not ok 41175 1727204664.38056: done checking to see if all hosts have failed 41175 1727204664.38057: getting the remaining hosts for this loop 41175 1727204664.38058: done getting the remaining hosts for this loop 41175 1727204664.38062: getting the next task for host managed-node3 41175 1727204664.38071: done getting next task for host managed-node3 41175 1727204664.38075: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 41175 1727204664.38078: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41175 1727204664.38095: getting variables 41175 1727204664.38097: in VariableManager get_vars() 41175 1727204664.38144: Calling all_inventory to load vars for managed-node3 41175 1727204664.38148: Calling groups_inventory to load vars for managed-node3 41175 1727204664.38151: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204664.38162: Calling all_plugins_play to load vars for managed-node3 41175 1727204664.38165: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204664.38169: Calling groups_plugins_play to load vars for managed-node3 41175 1727204664.41262: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204664.44545: done with get_vars() 41175 1727204664.44579: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 15:04:24 -0400 (0:00:00.127) 0:00:31.585 ***** 41175 1727204664.44663: entering _queue_task() for managed-node3/stat 41175 1727204664.44941: worker is 1 (out of 1 available) 41175 1727204664.44956: exiting _queue_task() for managed-node3/stat 41175 1727204664.44968: done queuing things up, now waiting for results queue to drain 41175 1727204664.44970: waiting for pending results... 
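The skip recorded above comes from the conditional `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0` evaluating to False, i.e. every fact the role needs is already present, so re-running setup is unnecessary. The Jinja2 `difference` filter's role in that check can be re-implemented in plain Python; the fact names used below are illustrative assumptions, not values taken from this log:

```python
# Minimal re-implementation of the check behind the skipped task:
# __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
def missing_facts(required, gathered):
    """Return the required fact names that are absent from the gathered facts."""
    return [name for name in required if name not in gathered]

# Assumed example values for illustration only.
required = ["distribution", "distribution_major_version"]
gathered = {"distribution": "Fedora", "distribution_major_version": "40"}

# An empty result means the "length > 0" conditional is False and the
# setup task is skipped, matching the outcome recorded in the log.
print(missing_facts(required, gathered))  # []
```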
41175 1727204664.45159: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 41175 1727204664.45265: in run() - task 12b410aa-8751-f070-39c4-0000000006e5 41175 1727204664.45278: variable 'ansible_search_path' from source: unknown 41175 1727204664.45283: variable 'ansible_search_path' from source: unknown 41175 1727204664.45323: calling self._execute() 41175 1727204664.45412: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204664.45419: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204664.45432: variable 'omit' from source: magic vars 41175 1727204664.45761: variable 'ansible_distribution_major_version' from source: facts 41175 1727204664.45766: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204664.45909: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41175 1727204664.46134: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41175 1727204664.46171: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41175 1727204664.46208: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41175 1727204664.46240: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41175 1727204664.46347: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41175 1727204664.46369: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41175 1727204664.46394: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204664.46422: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41175 1727204664.46494: variable '__network_is_ostree' from source: set_fact 41175 1727204664.46501: Evaluated conditional (not __network_is_ostree is defined): False 41175 1727204664.46505: when evaluation is False, skipping this task 41175 1727204664.46508: _execute() done 41175 1727204664.46513: dumping result to json 41175 1727204664.46520: done dumping result, returning 41175 1727204664.46526: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [12b410aa-8751-f070-39c4-0000000006e5] 41175 1727204664.46537: sending task result for task 12b410aa-8751-f070-39c4-0000000006e5 41175 1727204664.46628: done sending task result for task 12b410aa-8751-f070-39c4-0000000006e5 41175 1727204664.46630: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 41175 1727204664.46692: no more pending results, returning what we have 41175 1727204664.46696: results queue empty 41175 1727204664.46698: checking for any_errors_fatal 41175 1727204664.46705: done checking for any_errors_fatal 41175 1727204664.46706: checking for max_fail_percentage 41175 1727204664.46707: done checking for max_fail_percentage 41175 1727204664.46709: checking to see if all hosts have failed and the running result is not ok 41175 1727204664.46710: done checking to see if all hosts have failed 41175 1727204664.46711: getting the remaining hosts for this loop 41175 1727204664.46713: done getting the remaining hosts for this loop 41175 
1727204664.46720: getting the next task for host managed-node3 41175 1727204664.46726: done getting next task for host managed-node3 41175 1727204664.46730: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 41175 1727204664.46733: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41175 1727204664.46747: getting variables 41175 1727204664.46750: in VariableManager get_vars() 41175 1727204664.46787: Calling all_inventory to load vars for managed-node3 41175 1727204664.46797: Calling groups_inventory to load vars for managed-node3 41175 1727204664.46801: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204664.46811: Calling all_plugins_play to load vars for managed-node3 41175 1727204664.46815: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204664.46821: Calling groups_plugins_play to load vars for managed-node3 41175 1727204664.48173: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204664.49800: done with get_vars() 41175 1727204664.49825: done getting variables 41175 1727204664.49874: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK 
[fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 15:04:24 -0400 (0:00:00.052) 0:00:31.637 ***** 41175 1727204664.49904: entering _queue_task() for managed-node3/set_fact 41175 1727204664.50159: worker is 1 (out of 1 available) 41175 1727204664.50174: exiting _queue_task() for managed-node3/set_fact 41175 1727204664.50184: done queuing things up, now waiting for results queue to drain 41175 1727204664.50186: waiting for pending results... 41175 1727204664.50374: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 41175 1727204664.50475: in run() - task 12b410aa-8751-f070-39c4-0000000006e6 41175 1727204664.50487: variable 'ansible_search_path' from source: unknown 41175 1727204664.50493: variable 'ansible_search_path' from source: unknown 41175 1727204664.50526: calling self._execute() 41175 1727204664.50611: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204664.50621: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204664.50631: variable 'omit' from source: magic vars 41175 1727204664.50968: variable 'ansible_distribution_major_version' from source: facts 41175 1727204664.50981: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204664.51131: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41175 1727204664.51355: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41175 1727204664.51392: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41175 1727204664.51427: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41175 
1727204664.51459: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41175 1727204664.51566: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41175 1727204664.51588: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41175 1727204664.51611: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204664.51639: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41175 1727204664.51711: variable '__network_is_ostree' from source: set_fact 41175 1727204664.51719: Evaluated conditional (not __network_is_ostree is defined): False 41175 1727204664.51727: when evaluation is False, skipping this task 41175 1727204664.51731: _execute() done 41175 1727204664.51734: dumping result to json 41175 1727204664.51746: done dumping result, returning 41175 1727204664.51750: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12b410aa-8751-f070-39c4-0000000006e6] 41175 1727204664.51752: sending task result for task 12b410aa-8751-f070-39c4-0000000006e6 41175 1727204664.51839: done sending task result for task 12b410aa-8751-f070-39c4-0000000006e6 41175 1727204664.51842: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 41175 1727204664.51896: no more pending results, returning what we 
have 41175 1727204664.51900: results queue empty 41175 1727204664.51902: checking for any_errors_fatal 41175 1727204664.51908: done checking for any_errors_fatal 41175 1727204664.51909: checking for max_fail_percentage 41175 1727204664.51911: done checking for max_fail_percentage 41175 1727204664.51912: checking to see if all hosts have failed and the running result is not ok 41175 1727204664.51913: done checking to see if all hosts have failed 41175 1727204664.51914: getting the remaining hosts for this loop 41175 1727204664.51916: done getting the remaining hosts for this loop 41175 1727204664.51920: getting the next task for host managed-node3 41175 1727204664.51930: done getting next task for host managed-node3 41175 1727204664.51934: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 41175 1727204664.51937: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204664.51951: getting variables 41175 1727204664.51954: in VariableManager get_vars() 41175 1727204664.51991: Calling all_inventory to load vars for managed-node3 41175 1727204664.51994: Calling groups_inventory to load vars for managed-node3 41175 1727204664.51996: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204664.52007: Calling all_plugins_play to load vars for managed-node3 41175 1727204664.52010: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204664.52013: Calling groups_plugins_play to load vars for managed-node3 41175 1727204664.53251: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204664.54875: done with get_vars() 41175 1727204664.54904: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 15:04:24 -0400 (0:00:00.050) 0:00:31.688 ***** 41175 1727204664.54982: entering _queue_task() for managed-node3/service_facts 41175 1727204664.55250: worker is 1 (out of 1 available) 41175 1727204664.55264: exiting _queue_task() for managed-node3/service_facts 41175 1727204664.55277: done queuing things up, now waiting for results queue to drain 41175 1727204664.55279: waiting for pending results... 
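Both ostree tasks above are skipped with `false_condition: "not __network_is_ostree is defined"`, meaning the fact was already set earlier in the run, so neither the `stat` probe nor the `set_fact` needs to execute again. A plausible shape for that task pair, hedged — the names and conditional come from the log, while the `/run/ostree-booted` path and the register variable are assumptions:

```yaml
# Sketch only: the two skipped tasks report the same guard,
# "not __network_is_ostree is defined". The probe path and the
# register name below are assumptions, not taken from this log.
- name: Check if system is ostree
  stat:
    path: /run/ostree-booted
  register: __ostree_booted_stat
  when: not __network_is_ostree is defined

- name: Set flag to indicate system is ostree
  set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined
```

Guarding the `set_fact` behind the same condition is what makes the check idempotent across repeated role invocations in one play, which is why this second pass through `set_facts.yml` skips both tasks.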
41175 1727204664.55480: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which services are running 41175 1727204664.55576: in run() - task 12b410aa-8751-f070-39c4-0000000006e8 41175 1727204664.55591: variable 'ansible_search_path' from source: unknown 41175 1727204664.55594: variable 'ansible_search_path' from source: unknown 41175 1727204664.55631: calling self._execute() 41175 1727204664.55711: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204664.55718: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204664.55734: variable 'omit' from source: magic vars 41175 1727204664.56058: variable 'ansible_distribution_major_version' from source: facts 41175 1727204664.56071: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204664.56078: variable 'omit' from source: magic vars 41175 1727204664.56128: variable 'omit' from source: magic vars 41175 1727204664.56160: variable 'omit' from source: magic vars 41175 1727204664.56198: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41175 1727204664.56230: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41175 1727204664.56248: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41175 1727204664.56264: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204664.56277: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204664.56309: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41175 1727204664.56313: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204664.56318: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed-node3' 41175 1727204664.56407: Set connection var ansible_shell_executable to /bin/sh 41175 1727204664.56411: Set connection var ansible_shell_type to sh 41175 1727204664.56416: Set connection var ansible_pipelining to False 41175 1727204664.56428: Set connection var ansible_timeout to 10 41175 1727204664.56434: Set connection var ansible_connection to ssh 41175 1727204664.56440: Set connection var ansible_module_compression to ZIP_DEFLATED 41175 1727204664.56459: variable 'ansible_shell_executable' from source: unknown 41175 1727204664.56462: variable 'ansible_connection' from source: unknown 41175 1727204664.56465: variable 'ansible_module_compression' from source: unknown 41175 1727204664.56468: variable 'ansible_shell_type' from source: unknown 41175 1727204664.56473: variable 'ansible_shell_executable' from source: unknown 41175 1727204664.56477: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204664.56482: variable 'ansible_pipelining' from source: unknown 41175 1727204664.56485: variable 'ansible_timeout' from source: unknown 41175 1727204664.56491: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204664.56665: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 41175 1727204664.56675: variable 'omit' from source: magic vars 41175 1727204664.56681: starting attempt loop 41175 1727204664.56684: running the handler 41175 1727204664.56699: _low_level_execute_command(): starting 41175 1727204664.56707: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41175 1727204664.57260: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204664.57266: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204664.57269: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204664.57272: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204664.57333: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204664.57336: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204664.57342: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204664.57388: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204664.59135: stdout chunk (state=3): >>>/root <<< 41175 1727204664.59251: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204664.59301: stderr chunk (state=3): >>><<< 41175 1727204664.59305: stdout chunk (state=3): >>><<< 41175 1727204664.59330: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204664.59341: _low_level_execute_command(): starting 41175 1727204664.59347: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204664.5932615-42769-56983881964349 `" && echo ansible-tmp-1727204664.5932615-42769-56983881964349="` echo /root/.ansible/tmp/ansible-tmp-1727204664.5932615-42769-56983881964349 `" ) && sleep 0' 41175 1727204664.59778: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204664.59795: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204664.59798: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config <<< 41175 1727204664.59812: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204664.59828: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204664.59884: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204664.59888: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204664.59894: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204664.59933: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204664.61951: stdout chunk (state=3): >>>ansible-tmp-1727204664.5932615-42769-56983881964349=/root/.ansible/tmp/ansible-tmp-1727204664.5932615-42769-56983881964349 <<< 41175 1727204664.62067: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204664.62112: stderr chunk (state=3): >>><<< 41175 1727204664.62116: stdout chunk (state=3): >>><<< 41175 1727204664.62135: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204664.5932615-42769-56983881964349=/root/.ansible/tmp/ansible-tmp-1727204664.5932615-42769-56983881964349 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204664.62173: variable 'ansible_module_compression' from source: unknown 41175 1727204664.62210: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-41175qa2aqh90/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 41175 1727204664.62252: variable 'ansible_facts' from source: unknown 41175 1727204664.62306: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204664.5932615-42769-56983881964349/AnsiballZ_service_facts.py 41175 1727204664.62422: Sending initial data 41175 1727204664.62425: Sent initial data (161 bytes) 41175 1727204664.62856: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204664.62895: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 41175 1727204664.62898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204664.62901: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204664.62903: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 41175 1727204664.62905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204664.62957: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204664.62960: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204664.63003: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204664.64648: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 41175 1727204664.64657: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41175 1727204664.64683: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41175 1727204664.64719: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41175qa2aqh90/tmphxx1rl3_ /root/.ansible/tmp/ansible-tmp-1727204664.5932615-42769-56983881964349/AnsiballZ_service_facts.py <<< 41175 1727204664.64727: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204664.5932615-42769-56983881964349/AnsiballZ_service_facts.py" <<< 41175 1727204664.64757: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-41175qa2aqh90/tmphxx1rl3_" to remote "/root/.ansible/tmp/ansible-tmp-1727204664.5932615-42769-56983881964349/AnsiballZ_service_facts.py" <<< 41175 1727204664.64764: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204664.5932615-42769-56983881964349/AnsiballZ_service_facts.py" <<< 41175 1727204664.65552: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204664.65614: stderr chunk (state=3): >>><<< 41175 1727204664.65621: stdout chunk (state=3): >>><<< 41175 1727204664.65639: done transferring module to remote 41175 1727204664.65649: _low_level_execute_command(): starting 41175 1727204664.65654: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204664.5932615-42769-56983881964349/ /root/.ansible/tmp/ansible-tmp-1727204664.5932615-42769-56983881964349/AnsiballZ_service_facts.py && sleep 0' 41175 1727204664.66093: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204664.66096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204664.66099: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204664.66101: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204664.66160: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204664.66164: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204664.66199: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204664.68037: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204664.68080: stderr chunk (state=3): >>><<< 41175 1727204664.68084: stdout chunk (state=3): >>><<< 41175 1727204664.68103: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204664.68107: _low_level_execute_command(): starting 41175 1727204664.68109: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204664.5932615-42769-56983881964349/AnsiballZ_service_facts.py && sleep 0' 41175 1727204664.68552: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204664.68556: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204664.68558: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204664.68562: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204664.68603: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204664.68621: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204664.68663: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204666.65867: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"network": {"name": "network", "state": "running", "status": "enabled", "source": "sysv"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "import-state.service": {"name": "import-state.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "ip6tables.service": {"name": "ip6tables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "iptables.service": {"name": "iptables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "generated", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": 
"enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": 
"systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.se<<< 41175 1727204666.65903: stdout chunk (state=3): >>>rvice", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": <<< 41175 1727204666.65911: stdout chunk (state=3): >>>"systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", 
"status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": 
{"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "<<< 41175 1727204666.65920: stdout chunk (state=3): >>>disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": 
"grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "loadmodules.service": {"name": "loadmodules.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": <<< 41175 1727204666.65926: stdout chunk (state=3): >>>"disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": 
"inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": 
"systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": 
"systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 41175 1727204666.67575: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
<<< 41175 1727204666.67614: stderr chunk (state=3): >>><<< 41175 1727204666.67620: stdout chunk (state=3): >>><<< 41175 1727204666.67648: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"network": {"name": "network", "state": "running", "status": "enabled", "source": "sysv"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "import-state.service": {"name": "import-state.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"ip6tables.service": {"name": "ip6tables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "iptables.service": {"name": "iptables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "generated", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": 
"sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": 
"systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": 
"systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": 
"dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "loadmodules.service": {"name": "loadmodules.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", 
"state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", 
"state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 41175 1727204666.68629: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204664.5932615-42769-56983881964349/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41175 1727204666.68641: _low_level_execute_command(): starting 41175 1727204666.68647: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204664.5932615-42769-56983881964349/ > /dev/null 2>&1 && sleep 0' 41175 1727204666.69086: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204666.69133: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 41175 1727204666.69136: stderr chunk (state=3): >>>debug2: match not found <<< 41175 1727204666.69139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 41175 1727204666.69141: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204666.69143: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204666.69186: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204666.69192: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204666.69242: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204666.71216: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204666.71265: stderr chunk (state=3): >>><<< 41175 1727204666.71268: stdout chunk (state=3): >>><<< 41175 1727204666.71291: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204666.71300: handler run complete 41175 1727204666.71462: variable 'ansible_facts' from source: unknown 41175 1727204666.71595: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204666.72044: variable 'ansible_facts' from source: unknown 41175 1727204666.72168: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204666.72370: attempt loop complete, returning result 41175 1727204666.72375: _execute() done 41175 1727204666.72378: dumping result to json 41175 1727204666.72425: done dumping result, returning 41175 1727204666.72434: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which services are running [12b410aa-8751-f070-39c4-0000000006e8] 41175 1727204666.72440: sending task result for task 12b410aa-8751-f070-39c4-0000000006e8 41175 1727204666.73356: done sending task result for task 12b410aa-8751-f070-39c4-0000000006e8 41175 1727204666.73359: WORKER PROCESS EXITING ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 41175 1727204666.73419: no more pending results, returning what we have 41175 1727204666.73423: results queue empty 41175 1727204666.73424: checking for any_errors_fatal 41175 1727204666.73427: done checking for any_errors_fatal 41175 1727204666.73428: checking for max_fail_percentage 41175 1727204666.73429: done checking for max_fail_percentage 41175 1727204666.73430: checking to see if all hosts have failed and the running result is not ok 41175 1727204666.73431: done checking to see if all hosts have failed 41175 1727204666.73431: getting the remaining 
hosts for this loop 41175 1727204666.73432: done getting the remaining hosts for this loop 41175 1727204666.73435: getting the next task for host managed-node3 41175 1727204666.73439: done getting next task for host managed-node3 41175 1727204666.73442: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 41175 1727204666.73444: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41175 1727204666.73451: getting variables 41175 1727204666.73453: in VariableManager get_vars() 41175 1727204666.73477: Calling all_inventory to load vars for managed-node3 41175 1727204666.73479: Calling groups_inventory to load vars for managed-node3 41175 1727204666.73480: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204666.73488: Calling all_plugins_play to load vars for managed-node3 41175 1727204666.73493: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204666.73495: Calling groups_plugins_play to load vars for managed-node3 41175 1727204666.74632: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204666.76331: done with get_vars() 41175 1727204666.76355: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 15:04:26 -0400 
(0:00:02.214) 0:00:33.903 ***** 41175 1727204666.76435: entering _queue_task() for managed-node3/package_facts 41175 1727204666.76693: worker is 1 (out of 1 available) 41175 1727204666.76706: exiting _queue_task() for managed-node3/package_facts 41175 1727204666.76719: done queuing things up, now waiting for results queue to drain 41175 1727204666.76721: waiting for pending results... 41175 1727204666.76930: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 41175 1727204666.77031: in run() - task 12b410aa-8751-f070-39c4-0000000006e9 41175 1727204666.77050: variable 'ansible_search_path' from source: unknown 41175 1727204666.77055: variable 'ansible_search_path' from source: unknown 41175 1727204666.77086: calling self._execute() 41175 1727204666.77178: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204666.77186: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204666.77197: variable 'omit' from source: magic vars 41175 1727204666.77527: variable 'ansible_distribution_major_version' from source: facts 41175 1727204666.77539: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204666.77546: variable 'omit' from source: magic vars 41175 1727204666.77598: variable 'omit' from source: magic vars 41175 1727204666.77630: variable 'omit' from source: magic vars 41175 1727204666.77665: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41175 1727204666.77700: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41175 1727204666.77718: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41175 1727204666.77737: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204666.77749: Loading 
ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204666.77777: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41175 1727204666.77780: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204666.77785: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204666.77875: Set connection var ansible_shell_executable to /bin/sh 41175 1727204666.77881: Set connection var ansible_shell_type to sh 41175 1727204666.77886: Set connection var ansible_pipelining to False 41175 1727204666.77897: Set connection var ansible_timeout to 10 41175 1727204666.77907: Set connection var ansible_connection to ssh 41175 1727204666.77913: Set connection var ansible_module_compression to ZIP_DEFLATED 41175 1727204666.77938: variable 'ansible_shell_executable' from source: unknown 41175 1727204666.77941: variable 'ansible_connection' from source: unknown 41175 1727204666.77944: variable 'ansible_module_compression' from source: unknown 41175 1727204666.77949: variable 'ansible_shell_type' from source: unknown 41175 1727204666.77954: variable 'ansible_shell_executable' from source: unknown 41175 1727204666.77958: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204666.77964: variable 'ansible_pipelining' from source: unknown 41175 1727204666.77966: variable 'ansible_timeout' from source: unknown 41175 1727204666.77972: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204666.78150: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 41175 1727204666.78159: variable 'omit' from source: magic vars 41175 1727204666.78165: starting attempt loop 41175 
1727204666.78169: running the handler 41175 1727204666.78183: _low_level_execute_command(): starting 41175 1727204666.78192: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41175 1727204666.78742: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204666.78746: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204666.78751: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204666.78795: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204666.78821: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204666.78859: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204666.80610: stdout chunk (state=3): >>>/root <<< 41175 1727204666.80723: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204666.80782: stderr chunk (state=3): >>><<< 41175 1727204666.80785: stdout chunk (state=3): >>><<< 41175 1727204666.80806: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 
3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204666.80822: _low_level_execute_command(): starting 41175 1727204666.80825: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204666.8080683-42817-16799907900986 `" && echo ansible-tmp-1727204666.8080683-42817-16799907900986="` echo /root/.ansible/tmp/ansible-tmp-1727204666.8080683-42817-16799907900986 `" ) && sleep 0' 41175 1727204666.81295: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204666.81298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204666.81303: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 41175 1727204666.81314: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 41175 1727204666.81317: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204666.81361: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204666.81365: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204666.81408: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204666.83386: stdout chunk (state=3): >>>ansible-tmp-1727204666.8080683-42817-16799907900986=/root/.ansible/tmp/ansible-tmp-1727204666.8080683-42817-16799907900986 <<< 41175 1727204666.83509: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204666.83554: stderr chunk (state=3): >>><<< 41175 1727204666.83557: stdout chunk (state=3): >>><<< 41175 1727204666.83572: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204666.8080683-42817-16799907900986=/root/.ansible/tmp/ansible-tmp-1727204666.8080683-42817-16799907900986 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204666.83617: variable 'ansible_module_compression' from source: unknown 41175 1727204666.83656: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-41175qa2aqh90/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 41175 1727204666.83714: variable 'ansible_facts' from source: unknown 41175 1727204666.83854: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204666.8080683-42817-16799907900986/AnsiballZ_package_facts.py 41175 1727204666.83978: Sending initial data 41175 1727204666.83981: Sent initial data (161 bytes) 41175 1727204666.84446: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204666.84449: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 41175 1727204666.84451: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204666.84454: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204666.84458: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204666.84503: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204666.84520: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204666.84555: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204666.86145: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 41175 1727204666.86159: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41175 1727204666.86177: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41175 1727204666.86218: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41175qa2aqh90/tmpyf15lhto /root/.ansible/tmp/ansible-tmp-1727204666.8080683-42817-16799907900986/AnsiballZ_package_facts.py <<< 41175 1727204666.86221: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204666.8080683-42817-16799907900986/AnsiballZ_package_facts.py" <<< 41175 1727204666.86246: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-41175qa2aqh90/tmpyf15lhto" to remote "/root/.ansible/tmp/ansible-tmp-1727204666.8080683-42817-16799907900986/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204666.8080683-42817-16799907900986/AnsiballZ_package_facts.py" <<< 41175 1727204666.88299: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204666.88360: stderr chunk (state=3): >>><<< 41175 1727204666.88364: stdout chunk (state=3): >>><<< 41175 1727204666.88382: done transferring module to remote 41175 1727204666.88399: _low_level_execute_command(): starting 41175 1727204666.88407: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204666.8080683-42817-16799907900986/ /root/.ansible/tmp/ansible-tmp-1727204666.8080683-42817-16799907900986/AnsiballZ_package_facts.py && sleep 0' 41175 1727204666.88860: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204666.88864: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204666.88866: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204666.88869: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 41175 1727204666.88874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204666.88932: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204666.88935: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204666.88967: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204666.90894: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204666.90898: stderr chunk (state=3): >>><<< 41175 1727204666.90901: stdout chunk (state=3): >>><<< 41175 1727204666.90908: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204666.90920: _low_level_execute_command(): starting 41175 1727204666.90937: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204666.8080683-42817-16799907900986/AnsiballZ_package_facts.py && sleep 0' 41175 1727204666.91546: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 1727204666.91608: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204666.91681: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204666.91720: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204666.91794: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 41175 1727204667.55195: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", 
"version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", 
"version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "a<<< 41175 1727204667.55304: stdout chunk (state=3): >>>rch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], 
"vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": 
[{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": 
"0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", 
"version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", 
"version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": 
"633", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, <<< 41175 1727204667.55325: stdout chunk (state=3): >>>"arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", 
"version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release"<<< 41175 1727204667.55450: stdout chunk (state=3): >>>: "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": 
"kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", 
"release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": 
"3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": 
"NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", 
"release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": 
"rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": 
[{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": 
"13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": 
"boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", 
"release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", 
"version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_<<< 41175 1727204667.55509: stdout chunk (state=3): >>>64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chkconfig": [{"name": "chkconfig", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts": [{"name": "initscripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.14.10", 
"release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "network-scripts": [{"name": "network-scripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 41175 1727204667.57356: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 41175 1727204667.57420: stderr chunk (state=3): >>><<< 41175 1727204667.57424: stdout chunk (state=3): >>><<< 41175 1727204667.57467: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", 
"release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": 
"3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", 
"release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", 
"version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": 
"libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": 
[{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", 
"epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": 
"oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", 
"release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", 
"version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": 
"4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": 
"3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": 
"1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": 
"2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", 
"release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 
0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", 
"version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", 
"version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", 
"version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": 
"device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chkconfig": [{"name": "chkconfig", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.20", "release": "1.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "initscripts": [{"name": "initscripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "network-scripts": [{"name": "network-scripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
41175 1727204667.59699: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204666.8080683-42817-16799907900986/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41175 1727204667.59784: _low_level_execute_command(): starting 41175 1727204667.59788: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204666.8080683-42817-16799907900986/ > /dev/null 2>&1 && sleep 0' 41175 1727204667.60219: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204667.60222: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204667.60225: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204667.60227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 
originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204667.60285: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204667.60288: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204667.60335: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204667.62271: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204667.62326: stderr chunk (state=3): >>><<< 41175 1727204667.62330: stdout chunk (state=3): >>><<< 41175 1727204667.62343: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204667.62351: handler run complete 41175 1727204667.63157: variable 'ansible_facts' from 
source: unknown 41175 1727204667.63614: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204667.65574: variable 'ansible_facts' from source: unknown 41175 1727204667.66067: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204667.66847: attempt loop complete, returning result 41175 1727204667.66863: _execute() done 41175 1727204667.66867: dumping result to json 41175 1727204667.67044: done dumping result, returning 41175 1727204667.67055: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [12b410aa-8751-f070-39c4-0000000006e9] 41175 1727204667.67062: sending task result for task 12b410aa-8751-f070-39c4-0000000006e9 41175 1727204667.69084: done sending task result for task 12b410aa-8751-f070-39c4-0000000006e9 41175 1727204667.69087: WORKER PROCESS EXITING ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 41175 1727204667.69192: no more pending results, returning what we have 41175 1727204667.69195: results queue empty 41175 1727204667.69196: checking for any_errors_fatal 41175 1727204667.69201: done checking for any_errors_fatal 41175 1727204667.69201: checking for max_fail_percentage 41175 1727204667.69202: done checking for max_fail_percentage 41175 1727204667.69203: checking to see if all hosts have failed and the running result is not ok 41175 1727204667.69204: done checking to see if all hosts have failed 41175 1727204667.69204: getting the remaining hosts for this loop 41175 1727204667.69205: done getting the remaining hosts for this loop 41175 1727204667.69208: getting the next task for host managed-node3 41175 1727204667.69214: done getting next task for host managed-node3 41175 1727204667.69217: ^ task is: TASK: fedora.linux_system_roles.network : Print 
network provider 41175 1727204667.69219: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41175 1727204667.69227: getting variables 41175 1727204667.69228: in VariableManager get_vars() 41175 1727204667.69254: Calling all_inventory to load vars for managed-node3 41175 1727204667.69256: Calling groups_inventory to load vars for managed-node3 41175 1727204667.69259: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204667.69269: Calling all_plugins_play to load vars for managed-node3 41175 1727204667.69271: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204667.69273: Calling groups_plugins_play to load vars for managed-node3 41175 1727204667.70494: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204667.72122: done with get_vars() 41175 1727204667.72145: done getting variables 41175 1727204667.72202: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 15:04:27 -0400 (0:00:00.957) 0:00:34.861 ***** 41175 1727204667.72227: entering _queue_task() for managed-node3/debug 41175 1727204667.72487: worker is 1 (out of 1 available) 41175 1727204667.72502: exiting _queue_task() for managed-node3/debug 41175 1727204667.72516: done queuing things up, now waiting for 
results queue to drain 41175 1727204667.72517: waiting for pending results... 41175 1727204667.72714: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider 41175 1727204667.72797: in run() - task 12b410aa-8751-f070-39c4-0000000000b8 41175 1727204667.72811: variable 'ansible_search_path' from source: unknown 41175 1727204667.72815: variable 'ansible_search_path' from source: unknown 41175 1727204667.72852: calling self._execute() 41175 1727204667.72937: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204667.72944: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204667.72954: variable 'omit' from source: magic vars 41175 1727204667.73287: variable 'ansible_distribution_major_version' from source: facts 41175 1727204667.73303: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204667.73310: variable 'omit' from source: magic vars 41175 1727204667.73346: variable 'omit' from source: magic vars 41175 1727204667.73435: variable 'network_provider' from source: set_fact 41175 1727204667.73451: variable 'omit' from source: magic vars 41175 1727204667.73486: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41175 1727204667.73520: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41175 1727204667.73543: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41175 1727204667.73560: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204667.73571: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204667.73601: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41175 1727204667.73605: 
variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204667.73608: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204667.73694: Set connection var ansible_shell_executable to /bin/sh 41175 1727204667.73697: Set connection var ansible_shell_type to sh 41175 1727204667.73705: Set connection var ansible_pipelining to False 41175 1727204667.73715: Set connection var ansible_timeout to 10 41175 1727204667.73724: Set connection var ansible_connection to ssh 41175 1727204667.73734: Set connection var ansible_module_compression to ZIP_DEFLATED 41175 1727204667.73755: variable 'ansible_shell_executable' from source: unknown 41175 1727204667.73759: variable 'ansible_connection' from source: unknown 41175 1727204667.73761: variable 'ansible_module_compression' from source: unknown 41175 1727204667.73765: variable 'ansible_shell_type' from source: unknown 41175 1727204667.73768: variable 'ansible_shell_executable' from source: unknown 41175 1727204667.73773: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204667.73778: variable 'ansible_pipelining' from source: unknown 41175 1727204667.73781: variable 'ansible_timeout' from source: unknown 41175 1727204667.73786: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204667.73919: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41175 1727204667.73928: variable 'omit' from source: magic vars 41175 1727204667.73935: starting attempt loop 41175 1727204667.73938: running the handler 41175 1727204667.73985: handler run complete 41175 1727204667.74001: attempt loop complete, returning result 41175 1727204667.74005: _execute() done 41175 1727204667.74007: 
dumping result to json 41175 1727204667.74012: done dumping result, returning 41175 1727204667.74022: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider [12b410aa-8751-f070-39c4-0000000000b8] 41175 1727204667.74026: sending task result for task 12b410aa-8751-f070-39c4-0000000000b8 41175 1727204667.74115: done sending task result for task 12b410aa-8751-f070-39c4-0000000000b8 41175 1727204667.74121: WORKER PROCESS EXITING ok: [managed-node3] => {} MSG: Using network provider: nm 41175 1727204667.74184: no more pending results, returning what we have 41175 1727204667.74188: results queue empty 41175 1727204667.74191: checking for any_errors_fatal 41175 1727204667.74198: done checking for any_errors_fatal 41175 1727204667.74199: checking for max_fail_percentage 41175 1727204667.74201: done checking for max_fail_percentage 41175 1727204667.74202: checking to see if all hosts have failed and the running result is not ok 41175 1727204667.74203: done checking to see if all hosts have failed 41175 1727204667.74204: getting the remaining hosts for this loop 41175 1727204667.74206: done getting the remaining hosts for this loop 41175 1727204667.74211: getting the next task for host managed-node3 41175 1727204667.74219: done getting next task for host managed-node3 41175 1727204667.74223: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 41175 1727204667.74225: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204667.74237: getting variables 41175 1727204667.74239: in VariableManager get_vars() 41175 1727204667.74272: Calling all_inventory to load vars for managed-node3 41175 1727204667.74276: Calling groups_inventory to load vars for managed-node3 41175 1727204667.74278: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204667.74288: Calling all_plugins_play to load vars for managed-node3 41175 1727204667.74300: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204667.74304: Calling groups_plugins_play to load vars for managed-node3 41175 1727204667.75529: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204667.77156: done with get_vars() 41175 1727204667.77178: done getting variables 41175 1727204667.77231: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 15:04:27 -0400 (0:00:00.050) 0:00:34.911 ***** 41175 1727204667.77256: entering _queue_task() for managed-node3/fail 41175 1727204667.77502: worker is 1 (out of 1 available) 41175 1727204667.77519: exiting _queue_task() for managed-node3/fail 41175 1727204667.77531: done queuing things up, now waiting for results queue to drain 41175 1727204667.77533: waiting for pending results... 
41175 1727204667.77723: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 41175 1727204667.77796: in run() - task 12b410aa-8751-f070-39c4-0000000000b9 41175 1727204667.77809: variable 'ansible_search_path' from source: unknown 41175 1727204667.77812: variable 'ansible_search_path' from source: unknown 41175 1727204667.77844: calling self._execute() 41175 1727204667.77927: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204667.77933: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204667.77944: variable 'omit' from source: magic vars 41175 1727204667.78264: variable 'ansible_distribution_major_version' from source: facts 41175 1727204667.78275: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204667.78383: variable 'network_state' from source: role '' defaults 41175 1727204667.78396: Evaluated conditional (network_state != {}): False 41175 1727204667.78400: when evaluation is False, skipping this task 41175 1727204667.78403: _execute() done 41175 1727204667.78407: dumping result to json 41175 1727204667.78411: done dumping result, returning 41175 1727204667.78427: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12b410aa-8751-f070-39c4-0000000000b9] 41175 1727204667.78430: sending task result for task 12b410aa-8751-f070-39c4-0000000000b9 41175 1727204667.78525: done sending task result for task 12b410aa-8751-f070-39c4-0000000000b9 41175 1727204667.78528: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 41175 1727204667.78584: no more pending results, 
returning what we have 41175 1727204667.78588: results queue empty 41175 1727204667.78591: checking for any_errors_fatal 41175 1727204667.78597: done checking for any_errors_fatal 41175 1727204667.78598: checking for max_fail_percentage 41175 1727204667.78600: done checking for max_fail_percentage 41175 1727204667.78601: checking to see if all hosts have failed and the running result is not ok 41175 1727204667.78602: done checking to see if all hosts have failed 41175 1727204667.78603: getting the remaining hosts for this loop 41175 1727204667.78604: done getting the remaining hosts for this loop 41175 1727204667.78609: getting the next task for host managed-node3 41175 1727204667.78614: done getting next task for host managed-node3 41175 1727204667.78621: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 41175 1727204667.78623: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204667.78638: getting variables 41175 1727204667.78641: in VariableManager get_vars() 41175 1727204667.78674: Calling all_inventory to load vars for managed-node3 41175 1727204667.78677: Calling groups_inventory to load vars for managed-node3 41175 1727204667.78679: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204667.78696: Calling all_plugins_play to load vars for managed-node3 41175 1727204667.78700: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204667.78704: Calling groups_plugins_play to load vars for managed-node3 41175 1727204667.80041: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204667.81665: done with get_vars() 41175 1727204667.81686: done getting variables 41175 1727204667.81740: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 15:04:27 -0400 (0:00:00.045) 0:00:34.956 ***** 41175 1727204667.81765: entering _queue_task() for managed-node3/fail 41175 1727204667.82007: worker is 1 (out of 1 available) 41175 1727204667.82024: exiting _queue_task() for managed-node3/fail 41175 1727204667.82037: done queuing things up, now waiting for results queue to drain 41175 1727204667.82039: waiting for pending results... 
41175 1727204667.82226: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 41175 1727204667.82302: in run() - task 12b410aa-8751-f070-39c4-0000000000ba 41175 1727204667.82315: variable 'ansible_search_path' from source: unknown 41175 1727204667.82321: variable 'ansible_search_path' from source: unknown 41175 1727204667.82352: calling self._execute() 41175 1727204667.82436: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204667.82442: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204667.82453: variable 'omit' from source: magic vars 41175 1727204667.82777: variable 'ansible_distribution_major_version' from source: facts 41175 1727204667.82788: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204667.82902: variable 'network_state' from source: role '' defaults 41175 1727204667.82914: Evaluated conditional (network_state != {}): False 41175 1727204667.82922: when evaluation is False, skipping this task 41175 1727204667.82925: _execute() done 41175 1727204667.82929: dumping result to json 41175 1727204667.82931: done dumping result, returning 41175 1727204667.82941: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [12b410aa-8751-f070-39c4-0000000000ba] 41175 1727204667.82944: sending task result for task 12b410aa-8751-f070-39c4-0000000000ba 41175 1727204667.83041: done sending task result for task 12b410aa-8751-f070-39c4-0000000000ba 41175 1727204667.83044: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 41175 1727204667.83097: no more pending results, returning what we have 41175 
1727204667.83101: results queue empty 41175 1727204667.83102: checking for any_errors_fatal 41175 1727204667.83108: done checking for any_errors_fatal 41175 1727204667.83109: checking for max_fail_percentage 41175 1727204667.83111: done checking for max_fail_percentage 41175 1727204667.83112: checking to see if all hosts have failed and the running result is not ok 41175 1727204667.83113: done checking to see if all hosts have failed 41175 1727204667.83114: getting the remaining hosts for this loop 41175 1727204667.83119: done getting the remaining hosts for this loop 41175 1727204667.83123: getting the next task for host managed-node3 41175 1727204667.83128: done getting next task for host managed-node3 41175 1727204667.83132: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 41175 1727204667.83134: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204667.83151: getting variables 41175 1727204667.83153: in VariableManager get_vars() 41175 1727204667.83185: Calling all_inventory to load vars for managed-node3 41175 1727204667.83188: Calling groups_inventory to load vars for managed-node3 41175 1727204667.83192: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204667.83203: Calling all_plugins_play to load vars for managed-node3 41175 1727204667.83206: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204667.83210: Calling groups_plugins_play to load vars for managed-node3 41175 1727204667.84531: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204667.86165: done with get_vars() 41175 1727204667.86193: done getting variables 41175 1727204667.86244: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 15:04:27 -0400 (0:00:00.045) 0:00:35.001 ***** 41175 1727204667.86271: entering _queue_task() for managed-node3/fail 41175 1727204667.86532: worker is 1 (out of 1 available) 41175 1727204667.86546: exiting _queue_task() for managed-node3/fail 41175 1727204667.86558: done queuing things up, now waiting for results queue to drain 41175 1727204667.86560: waiting for pending results... 
41175 1727204667.86758: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 41175 1727204667.86841: in run() - task 12b410aa-8751-f070-39c4-0000000000bb 41175 1727204667.86854: variable 'ansible_search_path' from source: unknown 41175 1727204667.86858: variable 'ansible_search_path' from source: unknown 41175 1727204667.86891: calling self._execute() 41175 1727204667.86973: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204667.86980: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204667.86992: variable 'omit' from source: magic vars 41175 1727204667.87321: variable 'ansible_distribution_major_version' from source: facts 41175 1727204667.87331: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204667.87485: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41175 1727204667.89263: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41175 1727204667.89323: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41175 1727204667.89351: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41175 1727204667.89385: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41175 1727204667.89412: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41175 1727204667.89484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204667.89512: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204667.89537: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204667.89572: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204667.89585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204667.89678: variable 'ansible_distribution_major_version' from source: facts 41175 1727204667.89693: Evaluated conditional (ansible_distribution_major_version | int > 9): True 41175 1727204667.89794: variable 'ansible_distribution' from source: facts 41175 1727204667.89798: variable '__network_rh_distros' from source: role '' defaults 41175 1727204667.89809: Evaluated conditional (ansible_distribution in __network_rh_distros): False 41175 1727204667.89812: when evaluation is False, skipping this task 41175 1727204667.89819: _execute() done 41175 1727204667.89822: dumping result to json 41175 1727204667.89825: done dumping result, returning 41175 1727204667.89834: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12b410aa-8751-f070-39c4-0000000000bb] 41175 1727204667.89841: sending task result for task 12b410aa-8751-f070-39c4-0000000000bb 41175 1727204667.89939: done sending task result for task 12b410aa-8751-f070-39c4-0000000000bb 41175 1727204667.89942: WORKER PROCESS EXITING 
skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 41175 1727204667.90003: no more pending results, returning what we have 41175 1727204667.90008: results queue empty 41175 1727204667.90009: checking for any_errors_fatal 41175 1727204667.90015: done checking for any_errors_fatal 41175 1727204667.90018: checking for max_fail_percentage 41175 1727204667.90020: done checking for max_fail_percentage 41175 1727204667.90021: checking to see if all hosts have failed and the running result is not ok 41175 1727204667.90022: done checking to see if all hosts have failed 41175 1727204667.90023: getting the remaining hosts for this loop 41175 1727204667.90025: done getting the remaining hosts for this loop 41175 1727204667.90029: getting the next task for host managed-node3 41175 1727204667.90036: done getting next task for host managed-node3 41175 1727204667.90040: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 41175 1727204667.90042: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204667.90059: getting variables 41175 1727204667.90061: in VariableManager get_vars() 41175 1727204667.90101: Calling all_inventory to load vars for managed-node3 41175 1727204667.90104: Calling groups_inventory to load vars for managed-node3 41175 1727204667.90106: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204667.90119: Calling all_plugins_play to load vars for managed-node3 41175 1727204667.90123: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204667.90127: Calling groups_plugins_play to load vars for managed-node3 41175 1727204667.91421: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204667.93070: done with get_vars() 41175 1727204667.93096: done getting variables 41175 1727204667.93155: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 15:04:27 -0400 (0:00:00.069) 0:00:35.070 ***** 41175 1727204667.93180: entering _queue_task() for managed-node3/dnf 41175 1727204667.93449: worker is 1 (out of 1 available) 41175 1727204667.93464: exiting _queue_task() for managed-node3/dnf 41175 1727204667.93476: done queuing things up, now waiting for results queue to drain 41175 1727204667.93478: waiting for pending results... 
41175 1727204667.93684: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 41175 1727204667.93768: in run() - task 12b410aa-8751-f070-39c4-0000000000bc 41175 1727204667.93780: variable 'ansible_search_path' from source: unknown 41175 1727204667.93784: variable 'ansible_search_path' from source: unknown 41175 1727204667.93826: calling self._execute() 41175 1727204667.93907: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204667.93913: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204667.93928: variable 'omit' from source: magic vars 41175 1727204667.94256: variable 'ansible_distribution_major_version' from source: facts 41175 1727204667.94269: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204667.94451: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41175 1727204667.96552: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41175 1727204667.96603: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41175 1727204667.96637: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41175 1727204667.96695: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41175 1727204667.96698: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41175 1727204667.96762: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204667.96796: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204667.96820: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204667.96851: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204667.96863: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204667.96962: variable 'ansible_distribution' from source: facts 41175 1727204667.96966: variable 'ansible_distribution_major_version' from source: facts 41175 1727204667.96975: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 41175 1727204667.97069: variable '__network_wireless_connections_defined' from source: role '' defaults 41175 1727204667.97183: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204667.97210: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204667.97230: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204667.97261: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204667.97274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204667.97313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204667.97336: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204667.97356: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204667.97386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204667.97400: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204667.97438: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204667.97458: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 
1727204667.97478: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204667.97509: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204667.97524: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204667.97655: variable 'network_connections' from source: play vars 41175 1727204667.97667: variable 'profile' from source: play vars 41175 1727204667.97727: variable 'profile' from source: play vars 41175 1727204667.97731: variable 'interface' from source: set_fact 41175 1727204667.97783: variable 'interface' from source: set_fact 41175 1727204667.97843: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41175 1727204667.97994: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41175 1727204667.98027: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41175 1727204667.98054: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41175 1727204667.98080: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41175 1727204667.98122: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41175 1727204667.98141: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41175 1727204667.98165: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204667.98191: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41175 1727204667.98235: variable '__network_team_connections_defined' from source: role '' defaults 41175 1727204667.98433: variable 'network_connections' from source: play vars 41175 1727204667.98438: variable 'profile' from source: play vars 41175 1727204667.98488: variable 'profile' from source: play vars 41175 1727204667.98494: variable 'interface' from source: set_fact 41175 1727204667.98547: variable 'interface' from source: set_fact 41175 1727204667.98568: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 41175 1727204667.98571: when evaluation is False, skipping this task 41175 1727204667.98574: _execute() done 41175 1727204667.98579: dumping result to json 41175 1727204667.98583: done dumping result, returning 41175 1727204667.98593: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12b410aa-8751-f070-39c4-0000000000bc] 41175 1727204667.98599: sending task result for task 12b410aa-8751-f070-39c4-0000000000bc 41175 1727204667.98693: done sending task result for task 12b410aa-8751-f070-39c4-0000000000bc 41175 1727204667.98696: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": 
"Conditional result was False" } 41175 1727204667.98774: no more pending results, returning what we have 41175 1727204667.98778: results queue empty 41175 1727204667.98779: checking for any_errors_fatal 41175 1727204667.98786: done checking for any_errors_fatal 41175 1727204667.98787: checking for max_fail_percentage 41175 1727204667.98792: done checking for max_fail_percentage 41175 1727204667.98793: checking to see if all hosts have failed and the running result is not ok 41175 1727204667.98794: done checking to see if all hosts have failed 41175 1727204667.98795: getting the remaining hosts for this loop 41175 1727204667.98797: done getting the remaining hosts for this loop 41175 1727204667.98802: getting the next task for host managed-node3 41175 1727204667.98809: done getting next task for host managed-node3 41175 1727204667.98814: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 41175 1727204667.98818: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204667.98833: getting variables 41175 1727204667.98835: in VariableManager get_vars() 41175 1727204667.98872: Calling all_inventory to load vars for managed-node3 41175 1727204667.98875: Calling groups_inventory to load vars for managed-node3 41175 1727204667.98878: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204667.98896: Calling all_plugins_play to load vars for managed-node3 41175 1727204667.98900: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204667.98904: Calling groups_plugins_play to load vars for managed-node3 41175 1727204668.00308: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204668.01932: done with get_vars() 41175 1727204668.01963: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 41175 1727204668.02029: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 15:04:28 -0400 (0:00:00.088) 0:00:35.159 ***** 41175 1727204668.02056: entering _queue_task() for managed-node3/yum 41175 1727204668.02325: worker is 1 (out of 1 available) 41175 1727204668.02340: exiting _queue_task() for managed-node3/yum 41175 1727204668.02353: done queuing things up, now waiting for results queue to drain 41175 1727204668.02355: waiting for pending results... 
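Editor's note: the `skipping: [managed-node3]` results in this log come from Ansible's `when:` handling: each condition on a task is rendered through Jinja2 against the host's variables, and the first condition that evaluates false makes the executor return a skip result instead of dispatching the module. The following is a minimal plain-Python sketch of that control flow; the helper names are hypothetical and this is not Ansible's actual TaskExecutor API.

```python
# Sketch of Ansible-style `when:` evaluation (hypothetical helper names,
# NOT the real TaskExecutor API). The first False condition short-circuits
# the task and yields a skip result like the ones in the log above.

def evaluate_conditional(condition, variables):
    # Real Ansible renders the condition with Jinja2; this sketch only
    # models the simple boolean expressions visible in the log.
    return bool(eval(condition, {}, variables))  # eval() for sketch only

def run_task(task, variables):
    for cond in task.get("when", []):
        if not evaluate_conditional(cond, variables):
            # Mirrors the JSON the log prints for a skipped task.
            return {
                "changed": False,
                "skipped": True,
                "false_condition": cond,
                "skip_reason": "Conditional result was False",
            }
    return {"changed": True}  # a real run would dispatch to the module here

facts = {"wireless_defined": False, "team_defined": False}
task = {"when": ["wireless_defined or team_defined"]}
result = run_task(task, facts)
```

With both role-default flags false, the task is skipped exactly as the wireless/team checks are skipped in this run.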
41175 1727204668.02562: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 41175 1727204668.02646: in run() - task 12b410aa-8751-f070-39c4-0000000000bd 41175 1727204668.02658: variable 'ansible_search_path' from source: unknown 41175 1727204668.02662: variable 'ansible_search_path' from source: unknown 41175 1727204668.02698: calling self._execute() 41175 1727204668.02783: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204668.02788: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204668.02802: variable 'omit' from source: magic vars 41175 1727204668.03147: variable 'ansible_distribution_major_version' from source: facts 41175 1727204668.03159: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204668.03315: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41175 1727204668.05141: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41175 1727204668.05201: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41175 1727204668.05236: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41175 1727204668.05265: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41175 1727204668.05288: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41175 1727204668.05364: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204668.05391: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204668.05415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204668.05453: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204668.05466: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204668.05553: variable 'ansible_distribution_major_version' from source: facts 41175 1727204668.05567: Evaluated conditional (ansible_distribution_major_version | int < 8): False 41175 1727204668.05571: when evaluation is False, skipping this task 41175 1727204668.05576: _execute() done 41175 1727204668.05580: dumping result to json 41175 1727204668.05586: done dumping result, returning 41175 1727204668.05595: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12b410aa-8751-f070-39c4-0000000000bd] 41175 1727204668.05601: sending task result for task 12b410aa-8751-f070-39c4-0000000000bd 41175 1727204668.05700: done sending task result for task 12b410aa-8751-f070-39c4-0000000000bd 41175 1727204668.05703: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 41175 1727204668.05781: no more pending results, returning 
what we have 41175 1727204668.05785: results queue empty 41175 1727204668.05786: checking for any_errors_fatal 41175 1727204668.05797: done checking for any_errors_fatal 41175 1727204668.05798: checking for max_fail_percentage 41175 1727204668.05800: done checking for max_fail_percentage 41175 1727204668.05801: checking to see if all hosts have failed and the running result is not ok 41175 1727204668.05802: done checking to see if all hosts have failed 41175 1727204668.05803: getting the remaining hosts for this loop 41175 1727204668.05805: done getting the remaining hosts for this loop 41175 1727204668.05809: getting the next task for host managed-node3 41175 1727204668.05817: done getting next task for host managed-node3 41175 1727204668.05821: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 41175 1727204668.05823: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204668.05839: getting variables 41175 1727204668.05840: in VariableManager get_vars() 41175 1727204668.05879: Calling all_inventory to load vars for managed-node3 41175 1727204668.05882: Calling groups_inventory to load vars for managed-node3 41175 1727204668.05885: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204668.05903: Calling all_plugins_play to load vars for managed-node3 41175 1727204668.05907: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204668.05911: Calling groups_plugins_play to load vars for managed-node3 41175 1727204668.07300: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204668.08929: done with get_vars() 41175 1727204668.08956: done getting variables 41175 1727204668.09007: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 15:04:28 -0400 (0:00:00.069) 0:00:35.229 ***** 41175 1727204668.09034: entering _queue_task() for managed-node3/fail 41175 1727204668.09302: worker is 1 (out of 1 available) 41175 1727204668.09316: exiting _queue_task() for managed-node3/fail 41175 1727204668.09327: done queuing things up, now waiting for results queue to drain 41175 1727204668.09329: waiting for pending results... 
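Editor's note: the skip just above hinges on the condition `ansible_distribution_major_version | int < 8`. Distribution facts arrive as strings, so the role casts through Jinja2's `int` filter before the numeric comparison; that is also why the companion check earlier uses the string comparison `!= '6'`. A stdlib-only sketch of the pattern, with a hypothetical fact value:

```python
# Sketch of the version-gate pattern from the log. Facts are strings,
# so the role applies Jinja2's `int` filter before comparing. The fact
# value "40" here is a hypothetical example, not taken from this run.

def int_filter(value, default=0):
    # Mirrors Jinja2's `int` filter: best-effort cast with a fallback.
    try:
        return int(value)
    except (TypeError, ValueError):
        return default

ansible_distribution_major_version = "40"  # fact is a string

# Condition from the log: ansible_distribution_major_version | int < 8
use_yum_check = int_filter(ansible_distribution_major_version) < 8   # old EL only
use_dnf_check = int_filter(ansible_distribution_major_version) >= 8
```

On this host the `< 8` comparison is false, so the YUM-specific check is skipped and the DNF path is the one that ran earlier.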
41175 1727204668.09545: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 41175 1727204668.09639: in run() - task 12b410aa-8751-f070-39c4-0000000000be 41175 1727204668.09651: variable 'ansible_search_path' from source: unknown 41175 1727204668.09655: variable 'ansible_search_path' from source: unknown 41175 1727204668.09693: calling self._execute() 41175 1727204668.09781: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204668.09790: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204668.09801: variable 'omit' from source: magic vars 41175 1727204668.10132: variable 'ansible_distribution_major_version' from source: facts 41175 1727204668.10143: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204668.10250: variable '__network_wireless_connections_defined' from source: role '' defaults 41175 1727204668.10427: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41175 1727204668.12185: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41175 1727204668.12239: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41175 1727204668.12273: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41175 1727204668.12306: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41175 1727204668.12329: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41175 1727204668.12401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 41175 1727204668.12428: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204668.12449: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204668.12486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204668.12502: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204668.12543: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204668.12564: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204668.12590: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204668.12625: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204668.12638: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204668.12672: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204668.12702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204668.12723: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204668.12755: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204668.12767: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204668.12926: variable 'network_connections' from source: play vars 41175 1727204668.12936: variable 'profile' from source: play vars 41175 1727204668.12994: variable 'profile' from source: play vars 41175 1727204668.13002: variable 'interface' from source: set_fact 41175 1727204668.13056: variable 'interface' from source: set_fact 41175 1727204668.13122: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41175 1727204668.13266: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41175 1727204668.13300: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41175 1727204668.13327: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41175 1727204668.13356: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41175 1727204668.13396: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41175 1727204668.13414: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41175 1727204668.13436: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204668.13461: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41175 1727204668.13505: variable '__network_team_connections_defined' from source: role '' defaults 41175 1727204668.13712: variable 'network_connections' from source: play vars 41175 1727204668.13719: variable 'profile' from source: play vars 41175 1727204668.13770: variable 'profile' from source: play vars 41175 1727204668.13773: variable 'interface' from source: set_fact 41175 1727204668.13826: variable 'interface' from source: set_fact 41175 1727204668.13847: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 41175 1727204668.13851: when evaluation is False, skipping this task 41175 1727204668.13853: _execute() done 41175 1727204668.13858: dumping result to json 41175 1727204668.13863: done dumping result, returning 41175 1727204668.13871: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's 
consent to restart NetworkManager due to wireless or team interfaces [12b410aa-8751-f070-39c4-0000000000be] 41175 1727204668.13882: sending task result for task 12b410aa-8751-f070-39c4-0000000000be 41175 1727204668.13976: done sending task result for task 12b410aa-8751-f070-39c4-0000000000be 41175 1727204668.13979: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 41175 1727204668.14060: no more pending results, returning what we have 41175 1727204668.14064: results queue empty 41175 1727204668.14065: checking for any_errors_fatal 41175 1727204668.14071: done checking for any_errors_fatal 41175 1727204668.14072: checking for max_fail_percentage 41175 1727204668.14074: done checking for max_fail_percentage 41175 1727204668.14075: checking to see if all hosts have failed and the running result is not ok 41175 1727204668.14076: done checking to see if all hosts have failed 41175 1727204668.14077: getting the remaining hosts for this loop 41175 1727204668.14079: done getting the remaining hosts for this loop 41175 1727204668.14083: getting the next task for host managed-node3 41175 1727204668.14091: done getting next task for host managed-node3 41175 1727204668.14095: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 41175 1727204668.14097: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204668.14115: getting variables 41175 1727204668.14118: in VariableManager get_vars() 41175 1727204668.14155: Calling all_inventory to load vars for managed-node3 41175 1727204668.14158: Calling groups_inventory to load vars for managed-node3 41175 1727204668.14160: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204668.14170: Calling all_plugins_play to load vars for managed-node3 41175 1727204668.14173: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204668.14176: Calling groups_plugins_play to load vars for managed-node3 41175 1727204668.15445: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204668.17061: done with get_vars() 41175 1727204668.17085: done getting variables 41175 1727204668.17140: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 15:04:28 -0400 (0:00:00.081) 0:00:35.310 ***** 41175 1727204668.17167: entering _queue_task() for managed-node3/package 41175 1727204668.17422: worker is 1 (out of 1 available) 41175 1727204668.17437: exiting _queue_task() for managed-node3/package 41175 1727204668.17449: done queuing things up, now waiting for results queue to drain 41175 1727204668.17450: waiting for pending results... 
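Editor's note: the Install packages task below assembles its package list from role defaults such as `__network_packages_default_nm` and `__network_packages_default_wpa_supplicant`, with `wpa_supplicant` pulled in only when wireless or 802.1X profiles are defined (the `__network_wpa_supplicant_required` variable in the log). The selection logic below is a simplified assumption for illustration, not the role's exact code; the package names are examples.

```python
# Simplified sketch of how the network role's package list is assembled
# before the `package` action runs. Variable roles are taken from the
# log; the branching here is an assumption, not the role's exact logic.

def build_network_packages(provider, wireless_defined, ieee802_1x_defined):
    packages = []
    if provider == "nm":
        # stands in for __network_packages_default_nm (simplified)
        packages.append("NetworkManager")
        # stands in for __network_wpa_supplicant_required: only needed
        # when wireless or 802.1X connections are defined
        if wireless_defined or ieee802_1x_defined:
            packages.append("wpa_supplicant")
    return packages

# In this run neither wireless nor 802.1X profiles are defined:
pkgs = build_network_packages("nm", wireless_defined=False,
                              ieee802_1x_defined=False)
```

With both flags false, only the base NetworkManager package remains, which is consistent with the wireless/team tasks being skipped throughout this log.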
41175 1727204668.17639: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages 41175 1727204668.17720: in run() - task 12b410aa-8751-f070-39c4-0000000000bf 41175 1727204668.17732: variable 'ansible_search_path' from source: unknown 41175 1727204668.17735: variable 'ansible_search_path' from source: unknown 41175 1727204668.17768: calling self._execute() 41175 1727204668.17849: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204668.17856: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204668.17865: variable 'omit' from source: magic vars 41175 1727204668.18188: variable 'ansible_distribution_major_version' from source: facts 41175 1727204668.18201: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204668.18372: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41175 1727204668.18588: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41175 1727204668.18628: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41175 1727204668.18658: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41175 1727204668.18692: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41175 1727204668.18782: variable 'network_packages' from source: role '' defaults 41175 1727204668.18873: variable '__network_provider_setup' from source: role '' defaults 41175 1727204668.18885: variable '__network_service_name_default_nm' from source: role '' defaults 41175 1727204668.18949: variable '__network_service_name_default_nm' from source: role '' defaults 41175 1727204668.18957: variable '__network_packages_default_nm' from source: role '' defaults 41175 1727204668.19042: variable 
'__network_packages_default_nm' from source: role '' defaults 41175 1727204668.19201: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41175 1727204668.25373: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41175 1727204668.25437: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41175 1727204668.25470: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41175 1727204668.25498: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41175 1727204668.25544: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41175 1727204668.25585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204668.25611: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204668.25635: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204668.25673: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204668.25686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 
1727204668.25726: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204668.25746: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204668.25769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204668.25803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204668.25819: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204668.26002: variable '__network_packages_default_gobject_packages' from source: role '' defaults 41175 1727204668.26100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204668.26124: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204668.26144: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204668.26174: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204668.26187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204668.26266: variable 'ansible_python' from source: facts 41175 1727204668.26287: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 41175 1727204668.26363: variable '__network_wpa_supplicant_required' from source: role '' defaults 41175 1727204668.26433: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 41175 1727204668.26540: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204668.26563: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204668.26583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204668.26616: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204668.26631: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204668.26674: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204668.26698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204668.26721: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204668.26754: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204668.26795: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204668.26887: variable 'network_connections' from source: play vars 41175 1727204668.26965: variable 'profile' from source: play vars 41175 1727204668.26979: variable 'profile' from source: play vars 41175 1727204668.26986: variable 'interface' from source: set_fact 41175 1727204668.27046: variable 'interface' from source: set_fact 41175 1727204668.27108: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41175 1727204668.27133: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41175 1727204668.27158: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204668.27184: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41175 1727204668.27221: variable '__network_wireless_connections_defined' from source: role '' defaults 41175 1727204668.27450: variable 'network_connections' from source: play vars 41175 1727204668.27453: variable 'profile' from source: play vars 41175 1727204668.27540: variable 'profile' from source: play vars 41175 1727204668.27547: variable 'interface' from source: set_fact 41175 1727204668.27606: variable 'interface' from source: set_fact 41175 1727204668.27638: variable '__network_packages_default_wireless' from source: role '' defaults 41175 1727204668.27703: variable '__network_wireless_connections_defined' from source: role '' defaults 41175 1727204668.27971: variable 'network_connections' from source: play vars 41175 1727204668.27979: variable 'profile' from source: play vars 41175 1727204668.28038: variable 'profile' from source: play vars 41175 1727204668.28041: variable 'interface' from source: set_fact 41175 1727204668.28134: variable 'interface' from source: set_fact 41175 1727204668.28159: variable '__network_packages_default_team' from source: role '' defaults 41175 1727204668.28246: variable '__network_team_connections_defined' from source: role '' defaults 41175 1727204668.28501: variable 'network_connections' from source: play vars 41175 1727204668.28504: variable 'profile' from source: play vars 41175 1727204668.28560: variable 'profile' from source: play vars 41175 1727204668.28564: variable 'interface' from source: set_fact 41175 1727204668.28649: variable 'interface' from source: set_fact 41175 1727204668.28693: variable '__network_service_name_default_initscripts' from source: role '' defaults 41175 1727204668.28747: 
variable '__network_service_name_default_initscripts' from source: role '' defaults 41175 1727204668.28755: variable '__network_packages_default_initscripts' from source: role '' defaults 41175 1727204668.28805: variable '__network_packages_default_initscripts' from source: role '' defaults 41175 1727204668.28987: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 41175 1727204668.29381: variable 'network_connections' from source: play vars 41175 1727204668.29384: variable 'profile' from source: play vars 41175 1727204668.29437: variable 'profile' from source: play vars 41175 1727204668.29441: variable 'interface' from source: set_fact 41175 1727204668.29497: variable 'interface' from source: set_fact 41175 1727204668.29506: variable 'ansible_distribution' from source: facts 41175 1727204668.29510: variable '__network_rh_distros' from source: role '' defaults 41175 1727204668.29519: variable 'ansible_distribution_major_version' from source: facts 41175 1727204668.29530: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 41175 1727204668.29669: variable 'ansible_distribution' from source: facts 41175 1727204668.29673: variable '__network_rh_distros' from source: role '' defaults 41175 1727204668.29679: variable 'ansible_distribution_major_version' from source: facts 41175 1727204668.29686: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 41175 1727204668.29826: variable 'ansible_distribution' from source: facts 41175 1727204668.29830: variable '__network_rh_distros' from source: role '' defaults 41175 1727204668.29836: variable 'ansible_distribution_major_version' from source: facts 41175 1727204668.29864: variable 'network_provider' from source: set_fact 41175 1727204668.29878: variable 'ansible_facts' from source: unknown 41175 1727204668.30482: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 41175 
1727204668.30486: when evaluation is False, skipping this task 41175 1727204668.30490: _execute() done 41175 1727204668.30493: dumping result to json 41175 1727204668.30496: done dumping result, returning 41175 1727204668.30505: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages [12b410aa-8751-f070-39c4-0000000000bf] 41175 1727204668.30508: sending task result for task 12b410aa-8751-f070-39c4-0000000000bf 41175 1727204668.30602: done sending task result for task 12b410aa-8751-f070-39c4-0000000000bf 41175 1727204668.30605: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 41175 1727204668.30659: no more pending results, returning what we have 41175 1727204668.30662: results queue empty 41175 1727204668.30663: checking for any_errors_fatal 41175 1727204668.30672: done checking for any_errors_fatal 41175 1727204668.30673: checking for max_fail_percentage 41175 1727204668.30675: done checking for max_fail_percentage 41175 1727204668.30676: checking to see if all hosts have failed and the running result is not ok 41175 1727204668.30677: done checking to see if all hosts have failed 41175 1727204668.30677: getting the remaining hosts for this loop 41175 1727204668.30679: done getting the remaining hosts for this loop 41175 1727204668.30683: getting the next task for host managed-node3 41175 1727204668.30692: done getting next task for host managed-node3 41175 1727204668.30696: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 41175 1727204668.30699: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 41175 1727204668.30713: getting variables 41175 1727204668.30719: in VariableManager get_vars() 41175 1727204668.30761: Calling all_inventory to load vars for managed-node3 41175 1727204668.30764: Calling groups_inventory to load vars for managed-node3 41175 1727204668.30766: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204668.30783: Calling all_plugins_play to load vars for managed-node3 41175 1727204668.30786: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204668.30798: Calling groups_plugins_play to load vars for managed-node3 41175 1727204668.35873: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204668.37480: done with get_vars() 41175 1727204668.37506: done getting variables 41175 1727204668.37552: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 15:04:28 -0400 (0:00:00.204) 0:00:35.514 ***** 41175 1727204668.37573: entering _queue_task() for managed-node3/package 41175 1727204668.37847: worker is 1 (out of 1 available) 41175 1727204668.37862: exiting _queue_task() for managed-node3/package 41175 1727204668.37874: done queuing things up, now waiting for results queue to drain 41175 1727204668.37876: waiting for pending results... 
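For reference, the task banner above (task path `roles/network/tasks/main.yml:85`) corresponds to a role task roughly like the following. This is a hedged sketch: only the task name, path, and the two evaluated conditionals (`ansible_distribution_major_version != '6'`, `network_state != {}`) appear in the log; the module arguments and package list are assumptions inferred from the task name.

```yaml
# Sketch of the task being queued above; package names are assumed
# from the task title, not confirmed by the log.
- name: Install NetworkManager and nmstate when using network_state variable
  package:
    name:
      - NetworkManager
      - nmstate
    state: present
  when:
    - ansible_distribution_major_version != '6'
    - network_state != {}
```

In this run the log shows `network_state != {}` evaluating to False, so the task is skipped with `"skip_reason": "Conditional result was False"`.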
41175 1727204668.38092: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 41175 1727204668.38181: in run() - task 12b410aa-8751-f070-39c4-0000000000c0 41175 1727204668.38196: variable 'ansible_search_path' from source: unknown 41175 1727204668.38199: variable 'ansible_search_path' from source: unknown 41175 1727204668.38237: calling self._execute() 41175 1727204668.38326: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204668.38333: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204668.38394: variable 'omit' from source: magic vars 41175 1727204668.38680: variable 'ansible_distribution_major_version' from source: facts 41175 1727204668.38693: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204668.38805: variable 'network_state' from source: role '' defaults 41175 1727204668.38815: Evaluated conditional (network_state != {}): False 41175 1727204668.38819: when evaluation is False, skipping this task 41175 1727204668.38825: _execute() done 41175 1727204668.38829: dumping result to json 41175 1727204668.38834: done dumping result, returning 41175 1727204668.38842: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12b410aa-8751-f070-39c4-0000000000c0] 41175 1727204668.38849: sending task result for task 12b410aa-8751-f070-39c4-0000000000c0 41175 1727204668.38948: done sending task result for task 12b410aa-8751-f070-39c4-0000000000c0 41175 1727204668.38951: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 41175 1727204668.39023: no more pending results, returning what we have 41175 1727204668.39027: results queue empty 41175 1727204668.39028: checking 
for any_errors_fatal 41175 1727204668.39036: done checking for any_errors_fatal 41175 1727204668.39037: checking for max_fail_percentage 41175 1727204668.39040: done checking for max_fail_percentage 41175 1727204668.39041: checking to see if all hosts have failed and the running result is not ok 41175 1727204668.39042: done checking to see if all hosts have failed 41175 1727204668.39043: getting the remaining hosts for this loop 41175 1727204668.39045: done getting the remaining hosts for this loop 41175 1727204668.39049: getting the next task for host managed-node3 41175 1727204668.39055: done getting next task for host managed-node3 41175 1727204668.39059: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 41175 1727204668.39063: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204668.39077: getting variables 41175 1727204668.39079: in VariableManager get_vars() 41175 1727204668.39114: Calling all_inventory to load vars for managed-node3 41175 1727204668.39118: Calling groups_inventory to load vars for managed-node3 41175 1727204668.39120: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204668.39131: Calling all_plugins_play to load vars for managed-node3 41175 1727204668.39134: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204668.39137: Calling groups_plugins_play to load vars for managed-node3 41175 1727204668.40338: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204668.42049: done with get_vars() 41175 1727204668.42072: done getting variables 41175 1727204668.42125: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 15:04:28 -0400 (0:00:00.045) 0:00:35.560 ***** 41175 1727204668.42148: entering _queue_task() for managed-node3/package 41175 1727204668.42386: worker is 1 (out of 1 available) 41175 1727204668.42404: exiting _queue_task() for managed-node3/package 41175 1727204668.42416: done queuing things up, now waiting for results queue to drain 41175 1727204668.42418: waiting for pending results... 
41175 1727204668.42616: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 41175 1727204668.42697: in run() - task 12b410aa-8751-f070-39c4-0000000000c1 41175 1727204668.42710: variable 'ansible_search_path' from source: unknown 41175 1727204668.42713: variable 'ansible_search_path' from source: unknown 41175 1727204668.42752: calling self._execute() 41175 1727204668.42838: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204668.42846: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204668.42857: variable 'omit' from source: magic vars 41175 1727204668.43193: variable 'ansible_distribution_major_version' from source: facts 41175 1727204668.43209: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204668.43315: variable 'network_state' from source: role '' defaults 41175 1727204668.43330: Evaluated conditional (network_state != {}): False 41175 1727204668.43334: when evaluation is False, skipping this task 41175 1727204668.43337: _execute() done 41175 1727204668.43339: dumping result to json 41175 1727204668.43345: done dumping result, returning 41175 1727204668.43353: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12b410aa-8751-f070-39c4-0000000000c1] 41175 1727204668.43360: sending task result for task 12b410aa-8751-f070-39c4-0000000000c1 41175 1727204668.43464: done sending task result for task 12b410aa-8751-f070-39c4-0000000000c1 41175 1727204668.43468: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 41175 1727204668.43526: no more pending results, returning what we have 41175 1727204668.43531: results queue empty 41175 1727204668.43532: checking for 
any_errors_fatal 41175 1727204668.43540: done checking for any_errors_fatal 41175 1727204668.43541: checking for max_fail_percentage 41175 1727204668.43543: done checking for max_fail_percentage 41175 1727204668.43544: checking to see if all hosts have failed and the running result is not ok 41175 1727204668.43545: done checking to see if all hosts have failed 41175 1727204668.43546: getting the remaining hosts for this loop 41175 1727204668.43548: done getting the remaining hosts for this loop 41175 1727204668.43553: getting the next task for host managed-node3 41175 1727204668.43559: done getting next task for host managed-node3 41175 1727204668.43563: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 41175 1727204668.43566: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204668.43581: getting variables 41175 1727204668.43583: in VariableManager get_vars() 41175 1727204668.43619: Calling all_inventory to load vars for managed-node3 41175 1727204668.43622: Calling groups_inventory to load vars for managed-node3 41175 1727204668.43624: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204668.43635: Calling all_plugins_play to load vars for managed-node3 41175 1727204668.43638: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204668.43641: Calling groups_plugins_play to load vars for managed-node3 41175 1727204668.44892: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204668.46531: done with get_vars() 41175 1727204668.46554: done getting variables 41175 1727204668.46604: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 15:04:28 -0400 (0:00:00.044) 0:00:35.605 ***** 41175 1727204668.46633: entering _queue_task() for managed-node3/service 41175 1727204668.46870: worker is 1 (out of 1 available) 41175 1727204668.46886: exiting _queue_task() for managed-node3/service 41175 1727204668.46901: done queuing things up, now waiting for results queue to drain 41175 1727204668.46903: waiting for pending results... 
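The task at `roles/network/tasks/main.yml:109` queued above uses the `service` action module (the log shows `Loading ActionModule 'service'`). A hedged sketch, assuming the obvious restart semantics from the task title; only the name, path, module, and the `false_condition` reported later in the log are confirmed:

```yaml
# Sketch; state/name args are assumptions based on the task title.
- name: Restart NetworkManager due to wireless or team interfaces
  service:
    name: NetworkManager
    state: restarted
  when: __network_wireless_connections_defined or __network_team_connections_defined
```

The log below shows both role-default flags evaluating to False for this play's `network_connections`, so the restart is skipped.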
41175 1727204668.47100: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 41175 1727204668.47180: in run() - task 12b410aa-8751-f070-39c4-0000000000c2 41175 1727204668.47194: variable 'ansible_search_path' from source: unknown 41175 1727204668.47198: variable 'ansible_search_path' from source: unknown 41175 1727204668.47232: calling self._execute() 41175 1727204668.47322: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204668.47326: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204668.47335: variable 'omit' from source: magic vars 41175 1727204668.47656: variable 'ansible_distribution_major_version' from source: facts 41175 1727204668.47667: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204668.47773: variable '__network_wireless_connections_defined' from source: role '' defaults 41175 1727204668.47949: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41175 1727204668.49957: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41175 1727204668.50015: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41175 1727204668.50058: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41175 1727204668.50091: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41175 1727204668.50116: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41175 1727204668.50184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 41175 1727204668.50214: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204668.50237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204668.50269: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204668.50282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204668.50330: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204668.50351: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204668.50372: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204668.50407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204668.50495: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204668.50499: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204668.50502: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204668.50505: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204668.50528: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204668.50538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204668.50678: variable 'network_connections' from source: play vars 41175 1727204668.50691: variable 'profile' from source: play vars 41175 1727204668.50755: variable 'profile' from source: play vars 41175 1727204668.50759: variable 'interface' from source: set_fact 41175 1727204668.50811: variable 'interface' from source: set_fact 41175 1727204668.50876: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41175 1727204668.51006: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41175 1727204668.51039: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41175 1727204668.51068: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41175 1727204668.51104: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41175 1727204668.51139: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41175 1727204668.51159: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41175 1727204668.51182: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204668.51212: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41175 1727204668.51255: variable '__network_team_connections_defined' from source: role '' defaults 41175 1727204668.51464: variable 'network_connections' from source: play vars 41175 1727204668.51468: variable 'profile' from source: play vars 41175 1727204668.51526: variable 'profile' from source: play vars 41175 1727204668.51529: variable 'interface' from source: set_fact 41175 1727204668.51579: variable 'interface' from source: set_fact 41175 1727204668.51603: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 41175 1727204668.51606: when evaluation is False, skipping this task 41175 1727204668.51609: _execute() done 41175 1727204668.51612: dumping result to json 41175 1727204668.51618: done dumping result, returning 41175 1727204668.51633: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart 
NetworkManager due to wireless or team interfaces [12b410aa-8751-f070-39c4-0000000000c2] 41175 1727204668.51643: sending task result for task 12b410aa-8751-f070-39c4-0000000000c2 41175 1727204668.51731: done sending task result for task 12b410aa-8751-f070-39c4-0000000000c2 41175 1727204668.51735: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 41175 1727204668.51786: no more pending results, returning what we have 41175 1727204668.51792: results queue empty 41175 1727204668.51793: checking for any_errors_fatal 41175 1727204668.51800: done checking for any_errors_fatal 41175 1727204668.51801: checking for max_fail_percentage 41175 1727204668.51802: done checking for max_fail_percentage 41175 1727204668.51803: checking to see if all hosts have failed and the running result is not ok 41175 1727204668.51804: done checking to see if all hosts have failed 41175 1727204668.51805: getting the remaining hosts for this loop 41175 1727204668.51807: done getting the remaining hosts for this loop 41175 1727204668.51812: getting the next task for host managed-node3 41175 1727204668.51818: done getting next task for host managed-node3 41175 1727204668.51822: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 41175 1727204668.51824: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204668.51838: getting variables 41175 1727204668.51840: in VariableManager get_vars() 41175 1727204668.51881: Calling all_inventory to load vars for managed-node3 41175 1727204668.51885: Calling groups_inventory to load vars for managed-node3 41175 1727204668.51887: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204668.51900: Calling all_plugins_play to load vars for managed-node3 41175 1727204668.51903: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204668.51906: Calling groups_plugins_play to load vars for managed-node3 41175 1727204668.53313: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204668.54915: done with get_vars() 41175 1727204668.54943: done getting variables 41175 1727204668.54995: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 15:04:28 -0400 (0:00:00.083) 0:00:35.689 ***** 41175 1727204668.55019: entering _queue_task() for managed-node3/service 41175 1727204668.55280: worker is 1 (out of 1 available) 41175 1727204668.55297: exiting _queue_task() for managed-node3/service 41175 1727204668.55311: done queuing things up, now waiting for results queue to drain 41175 1727204668.55313: waiting for pending results... 
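The task at `roles/network/tasks/main.yml:122` queued above is the first in this stretch whose conditional evaluates True (`network_provider == "nm" or network_state != {}`), and the log shows it resolving the `network_service_name` role default. A hedged sketch of what such a task plausibly looks like; the `enabled`/`state` arguments are assumptions from the task title, not confirmed by the log:

```yaml
# Sketch; only the task name, service action module, and the
# evaluated conditional are confirmed by the log output.
- name: Enable and start NetworkManager
  service:
    name: "{{ network_service_name }}"
    state: started
    enabled: true
  when: network_provider == "nm" or network_state != {}
```

Unlike the preceding tasks, this one proceeds into `__network_provider_setup` variable resolution rather than short-circuiting to a skip.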
41175 1727204668.55510: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 41175 1727204668.55593: in run() - task 12b410aa-8751-f070-39c4-0000000000c3 41175 1727204668.55607: variable 'ansible_search_path' from source: unknown 41175 1727204668.55610: variable 'ansible_search_path' from source: unknown 41175 1727204668.55645: calling self._execute() 41175 1727204668.55732: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204668.55739: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204668.55748: variable 'omit' from source: magic vars 41175 1727204668.56309: variable 'ansible_distribution_major_version' from source: facts 41175 1727204668.56313: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204668.56383: variable 'network_provider' from source: set_fact 41175 1727204668.56387: variable 'network_state' from source: role '' defaults 41175 1727204668.56401: Evaluated conditional (network_provider == "nm" or network_state != {}): True 41175 1727204668.56411: variable 'omit' from source: magic vars 41175 1727204668.56464: variable 'omit' from source: magic vars 41175 1727204668.56503: variable 'network_service_name' from source: role '' defaults 41175 1727204668.56598: variable 'network_service_name' from source: role '' defaults 41175 1727204668.56742: variable '__network_provider_setup' from source: role '' defaults 41175 1727204668.56746: variable '__network_service_name_default_nm' from source: role '' defaults 41175 1727204668.56831: variable '__network_service_name_default_nm' from source: role '' defaults 41175 1727204668.56842: variable '__network_packages_default_nm' from source: role '' defaults 41175 1727204668.56923: variable '__network_packages_default_nm' from source: role '' defaults 41175 1727204668.57233: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 41175 1727204668.59947: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41175 1727204668.59977: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41175 1727204668.60033: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41175 1727204668.60099: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41175 1727204668.60137: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41175 1727204668.60254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204668.60317: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204668.60359: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204668.60436: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204668.60462: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204668.60545: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 41175 1727204668.60583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204668.60695: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204668.60699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204668.60731: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204668.61007: variable '__network_packages_default_gobject_packages' from source: role '' defaults 41175 1727204668.61109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204668.61131: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204668.61160: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204668.61188: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204668.61203: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204668.61281: variable 'ansible_python' from source: facts 41175 1727204668.61303: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 41175 1727204668.61377: variable '__network_wpa_supplicant_required' from source: role '' defaults 41175 1727204668.61440: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 41175 1727204668.61548: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204668.61568: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204668.61595: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204668.61626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204668.61639: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204668.61679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204668.61707: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204668.61729: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204668.61759: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204668.61771: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204668.61888: variable 'network_connections' from source: play vars 41175 1727204668.61896: variable 'profile' from source: play vars 41175 1727204668.61962: variable 'profile' from source: play vars 41175 1727204668.61966: variable 'interface' from source: set_fact 41175 1727204668.62022: variable 'interface' from source: set_fact 41175 1727204668.62108: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41175 1727204668.62273: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41175 1727204668.62319: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41175 1727204668.62360: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41175 1727204668.62393: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41175 1727204668.62447: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41175 1727204668.62477: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41175 1727204668.62505: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204668.62534: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41175 1727204668.62579: variable '__network_wireless_connections_defined' from source: role '' defaults 41175 1727204668.62810: variable 'network_connections' from source: play vars 41175 1727204668.62819: variable 'profile' from source: play vars 41175 1727204668.62878: variable 'profile' from source: play vars 41175 1727204668.62882: variable 'interface' from source: set_fact 41175 1727204668.62939: variable 'interface' from source: set_fact 41175 1727204668.62967: variable '__network_packages_default_wireless' from source: role '' defaults 41175 1727204668.63037: variable '__network_wireless_connections_defined' from source: role '' defaults 41175 1727204668.63281: variable 'network_connections' from source: play vars 41175 1727204668.63286: variable 'profile' from source: play vars 41175 1727204668.63349: variable 'profile' from source: play vars 41175 1727204668.63353: variable 'interface' from source: set_fact 41175 1727204668.63415: variable 'interface' from source: set_fact 41175 1727204668.63439: variable '__network_packages_default_team' from source: role '' defaults 41175 1727204668.63509: variable '__network_team_connections_defined' from source: role '' defaults 41175 1727204668.63754: variable 
'network_connections' from source: play vars 41175 1727204668.63758: variable 'profile' from source: play vars 41175 1727204668.63823: variable 'profile' from source: play vars 41175 1727204668.63826: variable 'interface' from source: set_fact 41175 1727204668.63889: variable 'interface' from source: set_fact 41175 1727204668.63937: variable '__network_service_name_default_initscripts' from source: role '' defaults 41175 1727204668.63987: variable '__network_service_name_default_initscripts' from source: role '' defaults 41175 1727204668.63995: variable '__network_packages_default_initscripts' from source: role '' defaults 41175 1727204668.64053: variable '__network_packages_default_initscripts' from source: role '' defaults 41175 1727204668.64247: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 41175 1727204668.64691: variable 'network_connections' from source: play vars 41175 1727204668.64697: variable 'profile' from source: play vars 41175 1727204668.64751: variable 'profile' from source: play vars 41175 1727204668.64756: variable 'interface' from source: set_fact 41175 1727204668.64817: variable 'interface' from source: set_fact 41175 1727204668.64828: variable 'ansible_distribution' from source: facts 41175 1727204668.64832: variable '__network_rh_distros' from source: role '' defaults 41175 1727204668.64840: variable 'ansible_distribution_major_version' from source: facts 41175 1727204668.64852: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 41175 1727204668.65003: variable 'ansible_distribution' from source: facts 41175 1727204668.65006: variable '__network_rh_distros' from source: role '' defaults 41175 1727204668.65013: variable 'ansible_distribution_major_version' from source: facts 41175 1727204668.65022: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 41175 1727204668.65167: variable 'ansible_distribution' from source: 
facts 41175 1727204668.65171: variable '__network_rh_distros' from source: role '' defaults 41175 1727204668.65177: variable 'ansible_distribution_major_version' from source: facts 41175 1727204668.65212: variable 'network_provider' from source: set_fact 41175 1727204668.65235: variable 'omit' from source: magic vars 41175 1727204668.65262: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41175 1727204668.65285: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41175 1727204668.65307: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41175 1727204668.65326: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204668.65336: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204668.65363: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41175 1727204668.65366: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204668.65371: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204668.65462: Set connection var ansible_shell_executable to /bin/sh 41175 1727204668.65466: Set connection var ansible_shell_type to sh 41175 1727204668.65472: Set connection var ansible_pipelining to False 41175 1727204668.65481: Set connection var ansible_timeout to 10 41175 1727204668.65487: Set connection var ansible_connection to ssh 41175 1727204668.65495: Set connection var ansible_module_compression to ZIP_DEFLATED 41175 1727204668.65516: variable 'ansible_shell_executable' from source: unknown 41175 1727204668.65525: variable 'ansible_connection' from source: unknown 41175 1727204668.65530: variable 'ansible_module_compression' from source: unknown 41175 1727204668.65532: 
variable 'ansible_shell_type' from source: unknown 41175 1727204668.65535: variable 'ansible_shell_executable' from source: unknown 41175 1727204668.65537: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204668.65544: variable 'ansible_pipelining' from source: unknown 41175 1727204668.65546: variable 'ansible_timeout' from source: unknown 41175 1727204668.65555: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204668.65639: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41175 1727204668.65651: variable 'omit' from source: magic vars 41175 1727204668.65659: starting attempt loop 41175 1727204668.65662: running the handler 41175 1727204668.65730: variable 'ansible_facts' from source: unknown 41175 1727204668.66572: _low_level_execute_command(): starting 41175 1727204668.66579: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41175 1727204668.67128: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204668.67133: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204668.67136: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204668.67139: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204668.67187: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204668.67206: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204668.67249: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204668.69019: stdout chunk (state=3): >>>/root <<< 41175 1727204668.69130: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204668.69186: stderr chunk (state=3): >>><<< 41175 1727204668.69191: stdout chunk (state=3): >>><<< 41175 1727204668.69212: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204668.69227: _low_level_execute_command(): starting 41175 1727204668.69235: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204668.6921225-42853-186285732526455 `" && echo ansible-tmp-1727204668.6921225-42853-186285732526455="` echo /root/.ansible/tmp/ansible-tmp-1727204668.6921225-42853-186285732526455 `" ) && sleep 0' 41175 1727204668.69708: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204668.69712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204668.69714: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204668.69719: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 41175 1727204668.69722: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204668.69768: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204668.69771: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 <<< 41175 1727204668.69818: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204668.71810: stdout chunk (state=3): >>>ansible-tmp-1727204668.6921225-42853-186285732526455=/root/.ansible/tmp/ansible-tmp-1727204668.6921225-42853-186285732526455 <<< 41175 1727204668.71928: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204668.71983: stderr chunk (state=3): >>><<< 41175 1727204668.71986: stdout chunk (state=3): >>><<< 41175 1727204668.72006: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204668.6921225-42853-186285732526455=/root/.ansible/tmp/ansible-tmp-1727204668.6921225-42853-186285732526455 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204668.72037: variable 'ansible_module_compression' from source: unknown 41175 1727204668.72078: 
ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-41175qa2aqh90/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 41175 1727204668.72137: variable 'ansible_facts' from source: unknown 41175 1727204668.72275: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204668.6921225-42853-186285732526455/AnsiballZ_systemd.py 41175 1727204668.72403: Sending initial data 41175 1727204668.72407: Sent initial data (156 bytes) 41175 1727204668.72903: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204668.72906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 41175 1727204668.72910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204668.72912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204668.72968: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204668.72971: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204668.73014: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204668.74640: stderr chunk (state=3): >>>debug2: Remote version: 3 
debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 41175 1727204668.74644: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41175 1727204668.74673: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 41175 1727204668.74710: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41175qa2aqh90/tmputwhfvf7 /root/.ansible/tmp/ansible-tmp-1727204668.6921225-42853-186285732526455/AnsiballZ_systemd.py <<< 41175 1727204668.74714: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204668.6921225-42853-186285732526455/AnsiballZ_systemd.py" <<< 41175 1727204668.74743: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-41175qa2aqh90/tmputwhfvf7" to remote "/root/.ansible/tmp/ansible-tmp-1727204668.6921225-42853-186285732526455/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204668.6921225-42853-186285732526455/AnsiballZ_systemd.py" <<< 41175 1727204668.76429: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204668.76498: stderr chunk (state=3): >>><<< 41175 1727204668.76502: stdout chunk (state=3): >>><<< 41175 1727204668.76526: done transferring module to remote 41175 1727204668.76537: 
_low_level_execute_command(): starting 41175 1727204668.76542: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204668.6921225-42853-186285732526455/ /root/.ansible/tmp/ansible-tmp-1727204668.6921225-42853-186285732526455/AnsiballZ_systemd.py && sleep 0' 41175 1727204668.77014: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204668.77018: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 41175 1727204668.77021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204668.77023: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204668.77025: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204668.77082: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204668.77087: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204668.77121: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204668.78981: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204668.79031: stderr chunk (state=3): >>><<< 41175 
1727204668.79034: stdout chunk (state=3): >>><<< 41175 1727204668.79049: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204668.79053: _low_level_execute_command(): starting 41175 1727204668.79059: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204668.6921225-42853-186285732526455/AnsiballZ_systemd.py && sleep 0' 41175 1727204668.79498: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204668.79501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204668.79504: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204668.79506: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204668.79558: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204668.79566: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204668.79611: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204669.12075: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "647", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": 
"[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:45:24 EDT", "ExecMainStartTimestampMonotonic": "28911103", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "647", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3378", "MemoryCurrent": "11894784", "MemoryAvailable": "infinity", "CPUUsageNSec": "1930361000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", 
"BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "in<<< 41175 1727204669.12115: stdout chunk (state=3): >>>finity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": 
"0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target cloud-init.service shutdown.target NetworkManager-wait-online.service network.service network.target", "After": "network-pre.target basic.target dbus-broker.service cloud-init-local.service systemd-journald.socket sysinit.target dbus.socket system.slice", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "l<<< 41175 1727204669.12125: stdout chunk (state=3): >>>oaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 15:02:42 EDT", "StateChangeTimestampMonotonic": "1066831351", "InactiveExitTimestamp": "Tue 2024-09-24 14:45:24 EDT", "InactiveExitTimestampMonotonic": "28911342", "ActiveEnterTimestamp": "Tue 2024-09-24 14:45:25 EDT", 
"ActiveEnterTimestampMonotonic": "29816317", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:45:24 EDT", "ConditionTimestampMonotonic": "28901880", "AssertTimestamp": "Tue 2024-09-24 14:45:24 EDT", "AssertTimestampMonotonic": "28901883", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "b6a383b318af414f897f5e2227729b18", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 41175 1727204669.14068: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
<<< 41175 1727204669.14135: stderr chunk (state=3): >>><<< 41175 1727204669.14138: stdout chunk (state=3): >>><<< 41175 1727204669.14157: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "647", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:45:24 EDT", "ExecMainStartTimestampMonotonic": "28911103", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "647", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; 
stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3378", "MemoryCurrent": "11894784", "MemoryAvailable": "infinity", "CPUUsageNSec": "1930361000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", 
"MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", 
"PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target 
cloud-init.service shutdown.target NetworkManager-wait-online.service network.service network.target", "After": "network-pre.target basic.target dbus-broker.service cloud-init-local.service systemd-journald.socket sysinit.target dbus.socket system.slice", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 15:02:42 EDT", "StateChangeTimestampMonotonic": "1066831351", "InactiveExitTimestamp": "Tue 2024-09-24 14:45:24 EDT", "InactiveExitTimestampMonotonic": "28911342", "ActiveEnterTimestamp": "Tue 2024-09-24 14:45:25 EDT", "ActiveEnterTimestampMonotonic": "29816317", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:45:24 EDT", "ConditionTimestampMonotonic": "28901880", "AssertTimestamp": "Tue 2024-09-24 14:45:24 EDT", "AssertTimestampMonotonic": "28901883", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "b6a383b318af414f897f5e2227729b18", "CollectMode": 
"inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
41175 1727204669.14331: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204668.6921225-42853-186285732526455/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41175 1727204669.14352: _low_level_execute_command(): starting 41175 1727204669.14361: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204668.6921225-42853-186285732526455/ > /dev/null 2>&1 && sleep 0' 41175 1727204669.14860: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204669.14866: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204669.14869: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204669.14871: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204669.14874: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 41175 1727204669.14876: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204669.14928: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204669.14932: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204669.14974: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204669.16877: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204669.16932: stderr chunk (state=3): >>><<< 41175 1727204669.16935: stdout chunk (state=3): >>><<< 41175 1727204669.16950: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204669.16959: handler run complete 41175 1727204669.17010: attempt loop complete, returning result 41175 1727204669.17014: _execute() done 41175 1727204669.17016: dumping result to json 41175 1727204669.17036: done dumping result, returning 41175 1727204669.17046: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12b410aa-8751-f070-39c4-0000000000c3] 41175 1727204669.17051: sending task result for task 12b410aa-8751-f070-39c4-0000000000c3 41175 1727204669.17352: done sending task result for task 12b410aa-8751-f070-39c4-0000000000c3 41175 1727204669.17355: WORKER PROCESS EXITING ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 41175 1727204669.17414: no more pending results, returning what we have 41175 1727204669.17420: results queue empty 41175 1727204669.17421: checking for any_errors_fatal 41175 1727204669.17427: done checking for any_errors_fatal 41175 1727204669.17428: checking for max_fail_percentage 41175 1727204669.17430: done checking for max_fail_percentage 41175 1727204669.17431: checking to see if all hosts have failed and the running result is not ok 41175 1727204669.17432: done checking to see if all hosts have failed 41175 1727204669.17433: getting the remaining hosts for this loop 41175 1727204669.17435: done getting the remaining hosts for this loop 41175 1727204669.17439: getting the next task for host managed-node3 41175 1727204669.17445: done getting next task for host managed-node3 41175 1727204669.17449: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 41175 1727204669.17452: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41175 1727204669.17462: getting variables 41175 1727204669.17466: in VariableManager get_vars() 41175 1727204669.17512: Calling all_inventory to load vars for managed-node3 41175 1727204669.17515: Calling groups_inventory to load vars for managed-node3 41175 1727204669.17520: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204669.17531: Calling all_plugins_play to load vars for managed-node3 41175 1727204669.17534: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204669.17537: Calling groups_plugins_play to load vars for managed-node3 41175 1727204669.18921: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204669.20569: done with get_vars() 41175 1727204669.20608: done getting variables 41175 1727204669.20678: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 15:04:29 -0400 (0:00:00.656) 0:00:36.346 ***** 41175 1727204669.20715: entering _queue_task() for managed-node3/service 41175 1727204669.21095: worker is 1 (out of 1 available) 41175 1727204669.21111: exiting _queue_task() for managed-node3/service 41175 1727204669.21124: done queuing things up, now waiting for results queue to drain 41175 1727204669.21126: waiting for pending results... 
41175 1727204669.21622: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 41175 1727204669.21628: in run() - task 12b410aa-8751-f070-39c4-0000000000c4 41175 1727204669.21632: variable 'ansible_search_path' from source: unknown 41175 1727204669.21635: variable 'ansible_search_path' from source: unknown 41175 1727204669.21665: calling self._execute() 41175 1727204669.21796: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204669.21808: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204669.21823: variable 'omit' from source: magic vars 41175 1727204669.22187: variable 'ansible_distribution_major_version' from source: facts 41175 1727204669.22201: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204669.22308: variable 'network_provider' from source: set_fact 41175 1727204669.22313: Evaluated conditional (network_provider == "nm"): True 41175 1727204669.22400: variable '__network_wpa_supplicant_required' from source: role '' defaults 41175 1727204669.22476: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 41175 1727204669.22641: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41175 1727204669.24359: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41175 1727204669.24423: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41175 1727204669.24456: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41175 1727204669.24494: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41175 1727204669.24521: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41175 1727204669.24601: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204669.24626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204669.24648: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204669.24685: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204669.24700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204669.24742: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204669.24762: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204669.24786: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204669.24883: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204669.24887: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204669.24892: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204669.24895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204669.24909: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204669.24939: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204669.24952: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204669.25075: variable 'network_connections' from source: play vars 41175 1727204669.25086: variable 'profile' from source: play vars 41175 1727204669.25152: variable 'profile' from source: play vars 41175 1727204669.25156: variable 'interface' from source: set_fact 41175 1727204669.25209: variable 'interface' from source: set_fact 41175 1727204669.25272: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41175 1727204669.25406: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41175 1727204669.25441: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41175 1727204669.25470: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41175 1727204669.25496: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41175 1727204669.25534: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41175 1727204669.25553: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41175 1727204669.25577: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204669.25600: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41175 1727204669.25675: variable '__network_wireless_connections_defined' from source: role '' defaults 41175 1727204669.25860: variable 'network_connections' from source: play vars 41175 1727204669.25863: variable 'profile' from source: play vars 41175 1727204669.25918: variable 'profile' from source: play vars 41175 1727204669.25922: variable 'interface' from source: set_fact 41175 1727204669.25972: variable 'interface' from source: set_fact 41175 1727204669.26001: Evaluated conditional (__network_wpa_supplicant_required): False 41175 1727204669.26005: when evaluation is False, skipping this task 41175 1727204669.26007: _execute() done 41175 1727204669.26021: dumping result 
to json 41175 1727204669.26024: done dumping result, returning 41175 1727204669.26027: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12b410aa-8751-f070-39c4-0000000000c4] 41175 1727204669.26029: sending task result for task 12b410aa-8751-f070-39c4-0000000000c4 41175 1727204669.26129: done sending task result for task 12b410aa-8751-f070-39c4-0000000000c4 41175 1727204669.26132: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 41175 1727204669.26183: no more pending results, returning what we have 41175 1727204669.26187: results queue empty 41175 1727204669.26188: checking for any_errors_fatal 41175 1727204669.26216: done checking for any_errors_fatal 41175 1727204669.26219: checking for max_fail_percentage 41175 1727204669.26221: done checking for max_fail_percentage 41175 1727204669.26222: checking to see if all hosts have failed and the running result is not ok 41175 1727204669.26223: done checking to see if all hosts have failed 41175 1727204669.26224: getting the remaining hosts for this loop 41175 1727204669.26226: done getting the remaining hosts for this loop 41175 1727204669.26231: getting the next task for host managed-node3 41175 1727204669.26238: done getting next task for host managed-node3 41175 1727204669.26242: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 41175 1727204669.26244: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204669.26259: getting variables 41175 1727204669.26261: in VariableManager get_vars() 41175 1727204669.26308: Calling all_inventory to load vars for managed-node3 41175 1727204669.26311: Calling groups_inventory to load vars for managed-node3 41175 1727204669.26314: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204669.26328: Calling all_plugins_play to load vars for managed-node3 41175 1727204669.26331: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204669.26334: Calling groups_plugins_play to load vars for managed-node3 41175 1727204669.28207: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204669.30918: done with get_vars() 41175 1727204669.30946: done getting variables 41175 1727204669.31000: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 15:04:29 -0400 (0:00:00.103) 0:00:36.449 ***** 41175 1727204669.31028: entering _queue_task() for managed-node3/service 41175 1727204669.31298: worker is 1 (out of 1 available) 41175 1727204669.31315: exiting _queue_task() for managed-node3/service 41175 1727204669.31331: done queuing things up, now waiting for results queue to drain 41175 1727204669.31333: waiting for pending results... 
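The entries above show Ansible's `when` handling for the wpa_supplicant task: `ansible_distribution_major_version != '6'` and `network_provider == "nm"` both evaluated True, but `__network_wpa_supplicant_required` evaluated False, so `_execute()` short-circuited and the module was never dispatched to the host. A minimal sketch of such a guarded task (variable names are taken from the log; the actual task in the role may differ):

```yaml
# Hypothetical sketch of the conditionally-skipped task seen above.
# When any item in the `when` list is false, TaskExecutor returns a
# skip result without ever contacting the managed node.
- name: Enable and start wpa_supplicant
  service:
    name: wpa_supplicant
    state: started
    enabled: true
  when:
    - network_provider == "nm"
    - __network_wpa_supplicant_required
```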
41175 1727204669.31531: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service 41175 1727204669.31615: in run() - task 12b410aa-8751-f070-39c4-0000000000c5 41175 1727204669.31629: variable 'ansible_search_path' from source: unknown 41175 1727204669.31632: variable 'ansible_search_path' from source: unknown 41175 1727204669.31667: calling self._execute() 41175 1727204669.31757: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204669.31764: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204669.31774: variable 'omit' from source: magic vars 41175 1727204669.32102: variable 'ansible_distribution_major_version' from source: facts 41175 1727204669.32122: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204669.32223: variable 'network_provider' from source: set_fact 41175 1727204669.32226: Evaluated conditional (network_provider == "initscripts"): False 41175 1727204669.32229: when evaluation is False, skipping this task 41175 1727204669.32233: _execute() done 41175 1727204669.32236: dumping result to json 41175 1727204669.32248: done dumping result, returning 41175 1727204669.32253: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service [12b410aa-8751-f070-39c4-0000000000c5] 41175 1727204669.32284: sending task result for task 12b410aa-8751-f070-39c4-0000000000c5 41175 1727204669.32371: done sending task result for task 12b410aa-8751-f070-39c4-0000000000c5 41175 1727204669.32374: WORKER PROCESS EXITING skipping: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 41175 1727204669.32456: no more pending results, returning what we have 41175 1727204669.32462: results queue empty 41175 1727204669.32463: checking for any_errors_fatal 41175 1727204669.32472: done checking for 
any_errors_fatal 41175 1727204669.32473: checking for max_fail_percentage 41175 1727204669.32475: done checking for max_fail_percentage 41175 1727204669.32476: checking to see if all hosts have failed and the running result is not ok 41175 1727204669.32477: done checking to see if all hosts have failed 41175 1727204669.32478: getting the remaining hosts for this loop 41175 1727204669.32480: done getting the remaining hosts for this loop 41175 1727204669.32484: getting the next task for host managed-node3 41175 1727204669.32493: done getting next task for host managed-node3 41175 1727204669.32498: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 41175 1727204669.32501: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204669.32521: getting variables 41175 1727204669.32523: in VariableManager get_vars() 41175 1727204669.32563: Calling all_inventory to load vars for managed-node3 41175 1727204669.32567: Calling groups_inventory to load vars for managed-node3 41175 1727204669.32569: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204669.32582: Calling all_plugins_play to load vars for managed-node3 41175 1727204669.32586: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204669.32733: Calling groups_plugins_play to load vars for managed-node3 41175 1727204669.34437: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204669.36469: done with get_vars() 41175 1727204669.36509: done getting variables 41175 1727204669.36576: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 15:04:29 -0400 (0:00:00.055) 0:00:36.505 ***** 41175 1727204669.36614: entering _queue_task() for managed-node3/copy 41175 1727204669.36976: worker is 1 (out of 1 available) 41175 1727204669.37009: exiting _queue_task() for managed-node3/copy 41175 1727204669.37022: done queuing things up, now waiting for results queue to drain 41175 1727204669.37024: waiting for pending results... 
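Note that the "Enable network service" result above was rendered as `"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result"`: `no_log` replaces the result body even for a skipped task, while the following "Ensure initscripts network file dependency" skip (no `no_log`) shows its full `false_condition`. A hedged illustration of the difference (task content is assumed, not taken from the role source):

```yaml
# Hypothetical illustration of the no_log behaviour seen in the log:
# even a skip result is censored when no_log is set on the task.
- name: Enable network service
  service:
    name: network
    enabled: true
  no_log: true
  when: network_provider == "initscripts"
```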
41175 1727204669.37246: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 41175 1727204669.37332: in run() - task 12b410aa-8751-f070-39c4-0000000000c6 41175 1727204669.37345: variable 'ansible_search_path' from source: unknown 41175 1727204669.37348: variable 'ansible_search_path' from source: unknown 41175 1727204669.37382: calling self._execute() 41175 1727204669.37476: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204669.37481: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204669.37493: variable 'omit' from source: magic vars 41175 1727204669.37829: variable 'ansible_distribution_major_version' from source: facts 41175 1727204669.37841: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204669.37946: variable 'network_provider' from source: set_fact 41175 1727204669.37953: Evaluated conditional (network_provider == "initscripts"): False 41175 1727204669.37956: when evaluation is False, skipping this task 41175 1727204669.37960: _execute() done 41175 1727204669.37966: dumping result to json 41175 1727204669.37969: done dumping result, returning 41175 1727204669.37980: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12b410aa-8751-f070-39c4-0000000000c6] 41175 1727204669.37986: sending task result for task 12b410aa-8751-f070-39c4-0000000000c6 41175 1727204669.38084: done sending task result for task 12b410aa-8751-f070-39c4-0000000000c6 41175 1727204669.38087: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 41175 1727204669.38169: no more pending results, returning what we have 41175 1727204669.38173: results queue empty 41175 1727204669.38175: checking for 
any_errors_fatal 41175 1727204669.38180: done checking for any_errors_fatal 41175 1727204669.38181: checking for max_fail_percentage 41175 1727204669.38182: done checking for max_fail_percentage 41175 1727204669.38183: checking to see if all hosts have failed and the running result is not ok 41175 1727204669.38184: done checking to see if all hosts have failed 41175 1727204669.38185: getting the remaining hosts for this loop 41175 1727204669.38186: done getting the remaining hosts for this loop 41175 1727204669.38193: getting the next task for host managed-node3 41175 1727204669.38198: done getting next task for host managed-node3 41175 1727204669.38203: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 41175 1727204669.38206: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204669.38220: getting variables 41175 1727204669.38222: in VariableManager get_vars() 41175 1727204669.38255: Calling all_inventory to load vars for managed-node3 41175 1727204669.38258: Calling groups_inventory to load vars for managed-node3 41175 1727204669.38260: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204669.38271: Calling all_plugins_play to load vars for managed-node3 41175 1727204669.38274: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204669.38277: Calling groups_plugins_play to load vars for managed-node3 41175 1727204669.40322: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204669.42259: done with get_vars() 41175 1727204669.42288: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 15:04:29 -0400 (0:00:00.057) 0:00:36.562 ***** 41175 1727204669.42362: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 41175 1727204669.42630: worker is 1 (out of 1 available) 41175 1727204669.42644: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 41175 1727204669.42658: done queuing things up, now waiting for results queue to drain 41175 1727204669.42659: waiting for pending results... 
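The next task, "Configure networking connection profiles", is the first in this excerpt that actually executes: the log that follows shows the ssh connection plugin being configured (pipelining False, timeout 10, module compression ZIP_DEFLATED), a remote temp directory created under `~/.ansible/tmp`, and the AnsiballZ-packaged `network_connections` module transferred over SFTP before it runs. As a sketch, the connection variables the log reports being set for managed-node3 could be pinned explicitly in inventory like this (variable names are from the log; this inventory layout is an assumption):

```yaml
# Hypothetical inventory fragment matching the "Set connection var"
# entries logged below for managed-node3.
all:
  vars:
    ansible_connection: ssh
    ansible_shell_type: sh
    ansible_shell_executable: /bin/sh
    ansible_pipelining: false
    ansible_timeout: 10
    ansible_module_compression: ZIP_DEFLATED
```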
41175 1727204669.43009: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 41175 1727204669.43014: in run() - task 12b410aa-8751-f070-39c4-0000000000c7 41175 1727204669.43041: variable 'ansible_search_path' from source: unknown 41175 1727204669.43051: variable 'ansible_search_path' from source: unknown 41175 1727204669.43099: calling self._execute() 41175 1727204669.43252: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204669.43355: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204669.43359: variable 'omit' from source: magic vars 41175 1727204669.43776: variable 'ansible_distribution_major_version' from source: facts 41175 1727204669.43813: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204669.43864: variable 'omit' from source: magic vars 41175 1727204669.43877: variable 'omit' from source: magic vars 41175 1727204669.44030: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41175 1727204669.45780: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41175 1727204669.45839: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41175 1727204669.45873: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41175 1727204669.45907: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41175 1727204669.45932: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41175 1727204669.46006: variable 'network_provider' from source: set_fact 41175 1727204669.46121: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204669.46158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204669.46182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204669.46220: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204669.46236: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204669.46300: variable 'omit' from source: magic vars 41175 1727204669.46397: variable 'omit' from source: magic vars 41175 1727204669.46486: variable 'network_connections' from source: play vars 41175 1727204669.46501: variable 'profile' from source: play vars 41175 1727204669.46561: variable 'profile' from source: play vars 41175 1727204669.46565: variable 'interface' from source: set_fact 41175 1727204669.46619: variable 'interface' from source: set_fact 41175 1727204669.46747: variable 'omit' from source: magic vars 41175 1727204669.46753: variable '__lsr_ansible_managed' from source: task vars 41175 1727204669.46806: variable '__lsr_ansible_managed' from source: task vars 41175 1727204669.47043: Loaded config def from plugin (lookup/template) 41175 1727204669.47048: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 41175 1727204669.47076: File lookup term: get_ansible_managed.j2 41175 
1727204669.47080: variable 'ansible_search_path' from source: unknown 41175 1727204669.47085: evaluation_path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 41175 1727204669.47099: search_path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 41175 1727204669.47113: variable 'ansible_search_path' from source: unknown 41175 1727204669.52944: variable 'ansible_managed' from source: unknown 41175 1727204669.53081: variable 'omit' from source: magic vars 41175 1727204669.53107: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41175 1727204669.53136: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41175 1727204669.53158: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41175 1727204669.53174: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204669.53184: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204669.53211: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41175 1727204669.53215: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204669.53222: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204669.53307: Set connection var ansible_shell_executable to /bin/sh 41175 1727204669.53310: Set connection var ansible_shell_type to sh 41175 1727204669.53316: Set connection var ansible_pipelining to False 41175 1727204669.53327: Set connection var ansible_timeout to 10 41175 1727204669.53335: Set connection var ansible_connection to ssh 41175 1727204669.53340: Set connection var ansible_module_compression to ZIP_DEFLATED 41175 1727204669.53364: variable 'ansible_shell_executable' from source: unknown 41175 1727204669.53367: variable 'ansible_connection' from source: unknown 41175 1727204669.53370: variable 'ansible_module_compression' from source: unknown 41175 1727204669.53374: variable 'ansible_shell_type' from source: unknown 41175 1727204669.53378: variable 'ansible_shell_executable' from source: unknown 41175 1727204669.53382: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204669.53387: variable 'ansible_pipelining' from source: unknown 41175 1727204669.53392: variable 'ansible_timeout' from source: unknown 41175 1727204669.53398: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204669.53514: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 41175 1727204669.53526: variable 'omit' from source: magic vars 41175 1727204669.53534: starting attempt loop 41175 1727204669.53538: running the handler 41175 1727204669.53551: _low_level_execute_command(): starting 41175 1727204669.53558: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41175 1727204669.54107: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204669.54111: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204669.54114: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204669.54118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204669.54174: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204669.54178: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204669.54180: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204669.54230: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 
1727204669.55991: stdout chunk (state=3): >>>/root <<< 41175 1727204669.56100: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204669.56158: stderr chunk (state=3): >>><<< 41175 1727204669.56161: stdout chunk (state=3): >>><<< 41175 1727204669.56191: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204669.56202: _low_level_execute_command(): starting 41175 1727204669.56209: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204669.5619073-42885-106932209191108 `" && echo ansible-tmp-1727204669.5619073-42885-106932209191108="` echo /root/.ansible/tmp/ansible-tmp-1727204669.5619073-42885-106932209191108 `" ) && sleep 0' 41175 1727204669.56702: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 
3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204669.56705: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 41175 1727204669.56710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 41175 1727204669.56713: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204669.56715: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204669.56766: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204669.56770: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204669.56815: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204669.58831: stdout chunk (state=3): >>>ansible-tmp-1727204669.5619073-42885-106932209191108=/root/.ansible/tmp/ansible-tmp-1727204669.5619073-42885-106932209191108 <<< 41175 1727204669.58947: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204669.59007: stderr chunk (state=3): >>><<< 41175 1727204669.59011: stdout chunk (state=3): >>><<< 41175 1727204669.59028: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204669.5619073-42885-106932209191108=/root/.ansible/tmp/ansible-tmp-1727204669.5619073-42885-106932209191108 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204669.59070: variable 'ansible_module_compression' from source: unknown 41175 1727204669.59121: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-41175qa2aqh90/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 41175 1727204669.59146: variable 'ansible_facts' from source: unknown 41175 1727204669.59223: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204669.5619073-42885-106932209191108/AnsiballZ_network_connections.py 41175 1727204669.59347: Sending initial data 41175 1727204669.59350: Sent initial data (168 bytes) 41175 1727204669.59827: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: 
Reading configuration data /root/.ssh/config <<< 41175 1727204669.59831: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 41175 1727204669.59839: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204669.59841: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204669.59844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204669.59895: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204669.59899: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204669.59940: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204669.61563: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server 
supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41175 1727204669.61611: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 41175 1727204669.61651: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41175qa2aqh90/tmp19vedlqx /root/.ansible/tmp/ansible-tmp-1727204669.5619073-42885-106932209191108/AnsiballZ_network_connections.py <<< 41175 1727204669.61655: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204669.5619073-42885-106932209191108/AnsiballZ_network_connections.py" <<< 41175 1727204669.61725: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-41175qa2aqh90/tmp19vedlqx" to remote "/root/.ansible/tmp/ansible-tmp-1727204669.5619073-42885-106932209191108/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204669.5619073-42885-106932209191108/AnsiballZ_network_connections.py" <<< 41175 1727204669.63397: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204669.63479: stderr chunk (state=3): >>><<< 41175 1727204669.63530: stdout chunk (state=3): >>><<< 41175 1727204669.63573: done transferring module to remote 41175 1727204669.63593: _low_level_execute_command(): starting 41175 1727204669.63604: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204669.5619073-42885-106932209191108/ /root/.ansible/tmp/ansible-tmp-1727204669.5619073-42885-106932209191108/AnsiballZ_network_connections.py && sleep 0' 41175 1727204669.64292: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 1727204669.64310: stderr chunk (state=3): >>>debug1: Reading configuration 
data /root/.ssh/config <<< 41175 1727204669.64339: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204669.64445: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204669.64461: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204669.64501: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204669.64529: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204669.64605: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204669.64623: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204669.66449: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204669.66498: stderr chunk (state=3): >>><<< 41175 1727204669.66502: stdout chunk (state=3): >>><<< 41175 1727204669.66521: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204669.66525: _low_level_execute_command(): starting 41175 1727204669.66527: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204669.5619073-42885-106932209191108/AnsiballZ_network_connections.py && sleep 0' 41175 1727204669.66979: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204669.66982: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204669.66985: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204669.66987: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 41175 1727204669.66991: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204669.67042: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204669.67053: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204669.67091: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204670.00278: stdout chunk (state=3): >>> <<< 41175 1727204670.00284: stdout chunk (state=3): >>>{"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}}<<< 41175 1727204670.00359: stdout chunk (state=3): >>> <<< 41175 1727204670.02499: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
<<< 41175 1727204670.02558: stderr chunk (state=3): >>><<< 41175 1727204670.02562: stdout chunk (state=3): >>><<< 41175 1727204670.02580: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
41175 1727204670.02620: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'ethtest0', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204669.5619073-42885-106932209191108/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41175 1727204670.02629: _low_level_execute_command(): starting 41175 1727204670.02636: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204669.5619073-42885-106932209191108/ > /dev/null 2>&1 && sleep 0' 41175 1727204670.03097: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204670.03102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204670.03116: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 41175 1727204670.03132: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204670.03237: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204670.03275: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204670.05401: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204670.05404: stdout chunk (state=3): >>><<< 41175 1727204670.05407: stderr chunk (state=3): >>><<< 41175 1727204670.05410: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 
debug2: Received exit status from master 0 41175 1727204670.05412: handler run complete 41175 1727204670.05415: attempt loop complete, returning result 41175 1727204670.05420: _execute() done 41175 1727204670.05422: dumping result to json 41175 1727204670.05424: done dumping result, returning 41175 1727204670.05427: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12b410aa-8751-f070-39c4-0000000000c7] 41175 1727204670.05429: sending task result for task 12b410aa-8751-f070-39c4-0000000000c7 41175 1727204670.05797: done sending task result for task 12b410aa-8751-f070-39c4-0000000000c7 41175 1727204670.05800: WORKER PROCESS EXITING changed: [managed-node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "ethtest0", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 41175 1727204670.05931: no more pending results, returning what we have 41175 1727204670.05935: results queue empty 41175 1727204670.05936: checking for any_errors_fatal 41175 1727204670.05945: done checking for any_errors_fatal 41175 1727204670.05946: checking for max_fail_percentage 41175 1727204670.05949: done checking for max_fail_percentage 41175 1727204670.05950: checking to see if all hosts have failed and the running result is not ok 41175 1727204670.05951: done checking to see if all hosts have failed 41175 1727204670.05952: getting the remaining hosts for this loop 41175 1727204670.05954: done getting the remaining hosts for this loop 41175 1727204670.05959: getting the next task for host managed-node3 41175 1727204670.05965: done getting next task for host managed-node3 41175 1727204670.05969: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 41175 1727204670.05972: ^ state is: HOST STATE: block=2, task=21, rescue=0, 
always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41175 1727204670.05984: getting variables 41175 1727204670.05986: in VariableManager get_vars() 41175 1727204670.06036: Calling all_inventory to load vars for managed-node3 41175 1727204670.06040: Calling groups_inventory to load vars for managed-node3 41175 1727204670.06042: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204670.06055: Calling all_plugins_play to load vars for managed-node3 41175 1727204670.06059: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204670.06063: Calling groups_plugins_play to load vars for managed-node3 41175 1727204670.08760: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204670.12033: done with get_vars() 41175 1727204670.12084: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 15:04:30 -0400 (0:00:00.698) 0:00:37.260 ***** 41175 1727204670.12186: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_state 41175 1727204670.12723: worker is 1 (out of 1 available) 41175 1727204670.12743: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_state 41175 1727204670.12755: done queuing things up, now waiting for results queue to drain 41175 1727204670.12757: waiting for pending results... 
41175 1727204670.13027: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state 41175 1727204670.13168: in run() - task 12b410aa-8751-f070-39c4-0000000000c8 41175 1727204670.13292: variable 'ansible_search_path' from source: unknown 41175 1727204670.13296: variable 'ansible_search_path' from source: unknown 41175 1727204670.13301: calling self._execute() 41175 1727204670.13400: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204670.13420: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204670.13440: variable 'omit' from source: magic vars 41175 1727204670.13945: variable 'ansible_distribution_major_version' from source: facts 41175 1727204670.14055: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204670.14143: variable 'network_state' from source: role '' defaults 41175 1727204670.14170: Evaluated conditional (network_state != {}): False 41175 1727204670.14183: when evaluation is False, skipping this task 41175 1727204670.14195: _execute() done 41175 1727204670.14204: dumping result to json 41175 1727204670.14213: done dumping result, returning 41175 1727204670.14230: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state [12b410aa-8751-f070-39c4-0000000000c8] 41175 1727204670.14272: sending task result for task 12b410aa-8751-f070-39c4-0000000000c8 skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 41175 1727204670.14645: no more pending results, returning what we have 41175 1727204670.14650: results queue empty 41175 1727204670.14651: checking for any_errors_fatal 41175 1727204670.14660: done checking for any_errors_fatal 41175 1727204670.14661: checking for max_fail_percentage 41175 1727204670.14664: done checking for max_fail_percentage 41175 1727204670.14665: 
checking to see if all hosts have failed and the running result is not ok 41175 1727204670.14666: done checking to see if all hosts have failed 41175 1727204670.14667: getting the remaining hosts for this loop 41175 1727204670.14669: done getting the remaining hosts for this loop 41175 1727204670.14673: getting the next task for host managed-node3 41175 1727204670.14679: done getting next task for host managed-node3 41175 1727204670.14683: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 41175 1727204670.14686: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41175 1727204670.14704: getting variables 41175 1727204670.14706: in VariableManager get_vars() 41175 1727204670.14750: Calling all_inventory to load vars for managed-node3 41175 1727204670.14754: Calling groups_inventory to load vars for managed-node3 41175 1727204670.14757: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204670.14769: Calling all_plugins_play to load vars for managed-node3 41175 1727204670.14773: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204670.14777: Calling groups_plugins_play to load vars for managed-node3 41175 1727204670.15306: done sending task result for task 12b410aa-8751-f070-39c4-0000000000c8 41175 1727204670.15309: WORKER PROCESS EXITING 41175 1727204670.17037: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204670.18713: done with get_vars() 41175 1727204670.18750: done getting variables 41175 1727204670.18829: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 15:04:30 -0400 (0:00:00.066) 0:00:37.327 ***** 41175 1727204670.18865: entering _queue_task() for managed-node3/debug 41175 1727204670.19237: worker is 1 (out of 1 available) 41175 1727204670.19253: exiting _queue_task() for managed-node3/debug 41175 1727204670.19266: done queuing things up, now waiting for results queue to drain 41175 1727204670.19268: waiting for pending results... 41175 1727204670.19594: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 41175 1727204670.19739: in run() - task 12b410aa-8751-f070-39c4-0000000000c9 41175 1727204670.19765: variable 'ansible_search_path' from source: unknown 41175 1727204670.19778: variable 'ansible_search_path' from source: unknown 41175 1727204670.19843: calling self._execute() 41175 1727204670.19935: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204670.19943: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204670.19954: variable 'omit' from source: magic vars 41175 1727204670.20294: variable 'ansible_distribution_major_version' from source: facts 41175 1727204670.20307: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204670.20314: variable 'omit' from source: magic vars 41175 1727204670.20351: variable 'omit' from source: magic vars 41175 1727204670.20382: variable 'omit' from source: magic vars 41175 1727204670.20424: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41175 1727204670.20455: Loading 
Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41175 1727204670.20475: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41175 1727204670.20492: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204670.20507: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204670.20536: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41175 1727204670.20540: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204670.20544: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204670.20635: Set connection var ansible_shell_executable to /bin/sh 41175 1727204670.20638: Set connection var ansible_shell_type to sh 41175 1727204670.20645: Set connection var ansible_pipelining to False 41175 1727204670.20654: Set connection var ansible_timeout to 10 41175 1727204670.20660: Set connection var ansible_connection to ssh 41175 1727204670.20667: Set connection var ansible_module_compression to ZIP_DEFLATED 41175 1727204670.20685: variable 'ansible_shell_executable' from source: unknown 41175 1727204670.20690: variable 'ansible_connection' from source: unknown 41175 1727204670.20693: variable 'ansible_module_compression' from source: unknown 41175 1727204670.20699: variable 'ansible_shell_type' from source: unknown 41175 1727204670.20702: variable 'ansible_shell_executable' from source: unknown 41175 1727204670.20704: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204670.20710: variable 'ansible_pipelining' from source: unknown 41175 1727204670.20712: variable 'ansible_timeout' from source: unknown 41175 1727204670.20723: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node3' 41175 1727204670.20847: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41175 1727204670.20858: variable 'omit' from source: magic vars 41175 1727204670.20864: starting attempt loop 41175 1727204670.20868: running the handler 41175 1727204670.20981: variable '__network_connections_result' from source: set_fact 41175 1727204670.21034: handler run complete 41175 1727204670.21053: attempt loop complete, returning result 41175 1727204670.21057: _execute() done 41175 1727204670.21061: dumping result to json 41175 1727204670.21063: done dumping result, returning 41175 1727204670.21073: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12b410aa-8751-f070-39c4-0000000000c9] 41175 1727204670.21079: sending task result for task 12b410aa-8751-f070-39c4-0000000000c9 41175 1727204670.21174: done sending task result for task 12b410aa-8751-f070-39c4-0000000000c9 41175 1727204670.21177: WORKER PROCESS EXITING ok: [managed-node3] => { "__network_connections_result.stderr_lines": [ "" ] } 41175 1727204670.21244: no more pending results, returning what we have 41175 1727204670.21248: results queue empty 41175 1727204670.21249: checking for any_errors_fatal 41175 1727204670.21259: done checking for any_errors_fatal 41175 1727204670.21260: checking for max_fail_percentage 41175 1727204670.21261: done checking for max_fail_percentage 41175 1727204670.21262: checking to see if all hosts have failed and the running result is not ok 41175 1727204670.21264: done checking to see if all hosts have failed 41175 1727204670.21265: getting the remaining hosts for this loop 41175 1727204670.21267: done getting the remaining hosts for this loop 
41175 1727204670.21272: getting the next task for host managed-node3 41175 1727204670.21278: done getting next task for host managed-node3 41175 1727204670.21281: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 41175 1727204670.21283: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41175 1727204670.21302: getting variables 41175 1727204670.21305: in VariableManager get_vars() 41175 1727204670.21344: Calling all_inventory to load vars for managed-node3 41175 1727204670.21347: Calling groups_inventory to load vars for managed-node3 41175 1727204670.21350: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204670.21361: Calling all_plugins_play to load vars for managed-node3 41175 1727204670.21364: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204670.21367: Calling groups_plugins_play to load vars for managed-node3 41175 1727204670.23347: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204670.25116: done with get_vars() 41175 1727204670.25149: done getting variables 41175 1727204670.25203: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 15:04:30 -0400 (0:00:00.063) 0:00:37.391 
***** 41175 1727204670.25230: entering _queue_task() for managed-node3/debug 41175 1727204670.25821: worker is 1 (out of 1 available) 41175 1727204670.25833: exiting _queue_task() for managed-node3/debug 41175 1727204670.25844: done queuing things up, now waiting for results queue to drain 41175 1727204670.25846: waiting for pending results... 41175 1727204670.26084: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 41175 1727204670.26091: in run() - task 12b410aa-8751-f070-39c4-0000000000ca 41175 1727204670.26095: variable 'ansible_search_path' from source: unknown 41175 1727204670.26098: variable 'ansible_search_path' from source: unknown 41175 1727204670.26138: calling self._execute() 41175 1727204670.26252: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204670.26267: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204670.26296: variable 'omit' from source: magic vars 41175 1727204670.26734: variable 'ansible_distribution_major_version' from source: facts 41175 1727204670.26757: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204670.26770: variable 'omit' from source: magic vars 41175 1727204670.26827: variable 'omit' from source: magic vars 41175 1727204670.26878: variable 'omit' from source: magic vars 41175 1727204670.26936: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41175 1727204670.27096: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41175 1727204670.27100: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41175 1727204670.27102: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204670.27105: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204670.27108: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41175 1727204670.27110: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204670.27112: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204670.27245: Set connection var ansible_shell_executable to /bin/sh 41175 1727204670.27254: Set connection var ansible_shell_type to sh 41175 1727204670.27266: Set connection var ansible_pipelining to False 41175 1727204670.27281: Set connection var ansible_timeout to 10 41175 1727204670.27296: Set connection var ansible_connection to ssh 41175 1727204670.27309: Set connection var ansible_module_compression to ZIP_DEFLATED 41175 1727204670.27343: variable 'ansible_shell_executable' from source: unknown 41175 1727204670.27352: variable 'ansible_connection' from source: unknown 41175 1727204670.27444: variable 'ansible_module_compression' from source: unknown 41175 1727204670.27447: variable 'ansible_shell_type' from source: unknown 41175 1727204670.27450: variable 'ansible_shell_executable' from source: unknown 41175 1727204670.27452: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204670.27454: variable 'ansible_pipelining' from source: unknown 41175 1727204670.27457: variable 'ansible_timeout' from source: unknown 41175 1727204670.27459: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204670.27580: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41175 1727204670.27603: variable 'omit' from source: magic vars 41175 1727204670.27616: starting attempt 
loop 41175 1727204670.27625: running the handler 41175 1727204670.27691: variable '__network_connections_result' from source: set_fact 41175 1727204670.27797: variable '__network_connections_result' from source: set_fact 41175 1727204670.27909: handler run complete 41175 1727204670.27934: attempt loop complete, returning result 41175 1727204670.27937: _execute() done 41175 1727204670.27940: dumping result to json 41175 1727204670.27945: done dumping result, returning 41175 1727204670.27955: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12b410aa-8751-f070-39c4-0000000000ca] 41175 1727204670.27961: sending task result for task 12b410aa-8751-f070-39c4-0000000000ca 41175 1727204670.28064: done sending task result for task 12b410aa-8751-f070-39c4-0000000000ca 41175 1727204670.28067: WORKER PROCESS EXITING
ok: [managed-node3] => {
    "__network_connections_result": {
        "_invocation": {
            "module_args": {
                "__debug_flags": "",
                "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
                "connections": [
                    {
                        "name": "ethtest0",
                        "state": "down"
                    }
                ],
                "force_state_change": false,
                "ignore_errors": false,
                "provider": "nm"
            }
        },
        "changed": true,
        "failed": false,
        "stderr": "\n",
        "stderr_lines": [
            ""
        ]
    }
}
41175 1727204670.28181: no more pending results, returning what we have 41175 1727204670.28185: results queue empty 41175 1727204670.28188: checking for any_errors_fatal 41175 1727204670.28193: done checking for any_errors_fatal 41175 1727204670.28194: checking for max_fail_percentage 41175 1727204670.28195: done checking for max_fail_percentage 41175 1727204670.28196: checking to see if all hosts have failed and the running result is not ok 41175 1727204670.28199: done checking to see if all hosts have failed 41175 1727204670.28199: getting the remaining hosts for this loop 41175 1727204670.28202: done getting the remaining hosts for this loop 41175 1727204670.28206: getting the next task for 
host managed-node3 41175 1727204670.28211: done getting next task for host managed-node3 41175 1727204670.28215: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 41175 1727204670.28217: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41175 1727204670.28227: getting variables 41175 1727204670.28229: in VariableManager get_vars() 41175 1727204670.28264: Calling all_inventory to load vars for managed-node3 41175 1727204670.28267: Calling groups_inventory to load vars for managed-node3 41175 1727204670.28269: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204670.28279: Calling all_plugins_play to load vars for managed-node3 41175 1727204670.28282: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204670.28285: Calling groups_plugins_play to load vars for managed-node3 41175 1727204670.29557: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204670.31705: done with get_vars() 41175 1727204670.31731: done getting variables 41175 1727204670.31783: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 15:04:30 -0400 (0:00:00.065) 0:00:37.457 ***** 41175 1727204670.31824: entering _queue_task() for 
managed-node3/debug 41175 1727204670.32081: worker is 1 (out of 1 available) 41175 1727204670.32097: exiting _queue_task() for managed-node3/debug 41175 1727204670.32110: done queuing things up, now waiting for results queue to drain 41175 1727204670.32112: waiting for pending results... 41175 1727204670.32312: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 41175 1727204670.32399: in run() - task 12b410aa-8751-f070-39c4-0000000000cb 41175 1727204670.32413: variable 'ansible_search_path' from source: unknown 41175 1727204670.32416: variable 'ansible_search_path' from source: unknown 41175 1727204670.32452: calling self._execute() 41175 1727204670.32539: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204670.32545: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204670.32555: variable 'omit' from source: magic vars 41175 1727204670.32893: variable 'ansible_distribution_major_version' from source: facts 41175 1727204670.32902: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204670.33010: variable 'network_state' from source: role '' defaults 41175 1727204670.33023: Evaluated conditional (network_state != {}): False 41175 1727204670.33026: when evaluation is False, skipping this task 41175 1727204670.33029: _execute() done 41175 1727204670.33034: dumping result to json 41175 1727204670.33039: done dumping result, returning 41175 1727204670.33047: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12b410aa-8751-f070-39c4-0000000000cb] 41175 1727204670.33054: sending task result for task 12b410aa-8751-f070-39c4-0000000000cb 41175 1727204670.33152: done sending task result for task 12b410aa-8751-f070-39c4-0000000000cb 41175 1727204670.33155: WORKER PROCESS EXITING
skipping: [managed-node3] => {
    "false_condition": "network_state != {}"
}
41175 1727204670.33222: no more pending results, returning what we have 41175 1727204670.33227: results queue empty 41175 1727204670.33228: checking for any_errors_fatal 41175 1727204670.33235: done checking for any_errors_fatal 41175 1727204670.33236: checking for max_fail_percentage 41175 1727204670.33238: done checking for max_fail_percentage 41175 1727204670.33239: checking to see if all hosts have failed and the running result is not ok 41175 1727204670.33240: done checking to see if all hosts have failed 41175 1727204670.33241: getting the remaining hosts for this loop 41175 1727204670.33243: done getting the remaining hosts for this loop 41175 1727204670.33247: getting the next task for host managed-node3 41175 1727204670.33254: done getting next task for host managed-node3 41175 1727204670.33258: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 41175 1727204670.33260: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204670.33276: getting variables 41175 1727204670.33278: in VariableManager get_vars() 41175 1727204670.33314: Calling all_inventory to load vars for managed-node3 41175 1727204670.33319: Calling groups_inventory to load vars for managed-node3 41175 1727204670.33322: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204670.33332: Calling all_plugins_play to load vars for managed-node3 41175 1727204670.33335: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204670.33338: Calling groups_plugins_play to load vars for managed-node3 41175 1727204670.34681: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204670.36303: done with get_vars() 41175 1727204670.36329: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 15:04:30 -0400 (0:00:00.045) 0:00:37.502 ***** 41175 1727204670.36405: entering _queue_task() for managed-node3/ping 41175 1727204670.36660: worker is 1 (out of 1 available) 41175 1727204670.36677: exiting _queue_task() for managed-node3/ping 41175 1727204670.36691: done queuing things up, now waiting for results queue to drain 41175 1727204670.36693: waiting for pending results... 
41175 1727204670.36884: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 41175 1727204670.36974: in run() - task 12b410aa-8751-f070-39c4-0000000000cc 41175 1727204670.36988: variable 'ansible_search_path' from source: unknown 41175 1727204670.36993: variable 'ansible_search_path' from source: unknown 41175 1727204670.37028: calling self._execute() 41175 1727204670.37111: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204670.37120: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204670.37128: variable 'omit' from source: magic vars 41175 1727204670.37460: variable 'ansible_distribution_major_version' from source: facts 41175 1727204670.37476: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204670.37481: variable 'omit' from source: magic vars 41175 1727204670.37522: variable 'omit' from source: magic vars 41175 1727204670.37553: variable 'omit' from source: magic vars 41175 1727204670.37593: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41175 1727204670.37626: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41175 1727204670.37643: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41175 1727204670.37659: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204670.37671: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204670.37703: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41175 1727204670.37708: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204670.37710: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node3' 41175 1727204670.37797: Set connection var ansible_shell_executable to /bin/sh 41175 1727204670.37802: Set connection var ansible_shell_type to sh 41175 1727204670.37805: Set connection var ansible_pipelining to False 41175 1727204670.37816: Set connection var ansible_timeout to 10 41175 1727204670.37822: Set connection var ansible_connection to ssh 41175 1727204670.37830: Set connection var ansible_module_compression to ZIP_DEFLATED 41175 1727204670.37849: variable 'ansible_shell_executable' from source: unknown 41175 1727204670.37852: variable 'ansible_connection' from source: unknown 41175 1727204670.37855: variable 'ansible_module_compression' from source: unknown 41175 1727204670.37860: variable 'ansible_shell_type' from source: unknown 41175 1727204670.37862: variable 'ansible_shell_executable' from source: unknown 41175 1727204670.37867: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204670.37872: variable 'ansible_pipelining' from source: unknown 41175 1727204670.37875: variable 'ansible_timeout' from source: unknown 41175 1727204670.37881: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204670.38060: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 41175 1727204670.38071: variable 'omit' from source: magic vars 41175 1727204670.38077: starting attempt loop 41175 1727204670.38081: running the handler 41175 1727204670.38098: _low_level_execute_command(): starting 41175 1727204670.38105: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41175 1727204670.38664: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 
1727204670.38668: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204670.38671: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204670.38673: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204670.38738: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204670.38746: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204670.38749: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204670.38786: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204670.40551: stdout chunk (state=3): >>>/root <<< 41175 1727204670.40661: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204670.40718: stderr chunk (state=3): >>><<< 41175 1727204670.40725: stdout chunk (state=3): >>><<< 41175 1727204670.40746: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204670.40759: _low_level_execute_command(): starting 41175 1727204670.40765: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204670.4074564-42920-37635106708909 `" && echo ansible-tmp-1727204670.4074564-42920-37635106708909="` echo /root/.ansible/tmp/ansible-tmp-1727204670.4074564-42920-37635106708909 `" ) && sleep 0' 41175 1727204670.41240: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 1727204670.41244: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204670.41246: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204670.41256: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 41175 1727204670.41259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204670.41310: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204670.41315: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204670.41318: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204670.41352: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204670.43338: stdout chunk (state=3): >>>ansible-tmp-1727204670.4074564-42920-37635106708909=/root/.ansible/tmp/ansible-tmp-1727204670.4074564-42920-37635106708909 <<< 41175 1727204670.43457: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204670.43507: stderr chunk (state=3): >>><<< 41175 1727204670.43510: stdout chunk (state=3): >>><<< 41175 1727204670.43529: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204670.4074564-42920-37635106708909=/root/.ansible/tmp/ansible-tmp-1727204670.4074564-42920-37635106708909 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204670.43575: variable 'ansible_module_compression' from source: unknown 41175 1727204670.43611: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-41175qa2aqh90/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 41175 1727204670.43651: variable 'ansible_facts' from source: unknown 41175 1727204670.43704: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204670.4074564-42920-37635106708909/AnsiballZ_ping.py 41175 1727204670.43816: Sending initial data 41175 1727204670.43820: Sent initial data (152 bytes) 41175 1727204670.44294: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204670.44297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204670.44301: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 41175 1727204670.44304: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 41175 1727204670.44306: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204670.44357: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204670.44361: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204670.44402: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204670.45988: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 41175 1727204670.45997: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41175 1727204670.46022: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41175 1727204670.46059: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41175qa2aqh90/tmpe_5vh05g /root/.ansible/tmp/ansible-tmp-1727204670.4074564-42920-37635106708909/AnsiballZ_ping.py <<< 41175 1727204670.46067: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204670.4074564-42920-37635106708909/AnsiballZ_ping.py" <<< 41175 1727204670.46098: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-41175qa2aqh90/tmpe_5vh05g" to remote "/root/.ansible/tmp/ansible-tmp-1727204670.4074564-42920-37635106708909/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204670.4074564-42920-37635106708909/AnsiballZ_ping.py" <<< 41175 1727204670.46821: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204670.46881: stderr chunk (state=3): >>><<< 41175 1727204670.46884: stdout chunk (state=3): >>><<< 41175 1727204670.46909: done transferring module to remote 41175 1727204670.46922: _low_level_execute_command(): starting 41175 1727204670.46925: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204670.4074564-42920-37635106708909/ /root/.ansible/tmp/ansible-tmp-1727204670.4074564-42920-37635106708909/AnsiballZ_ping.py && sleep 0' 41175 1727204670.47386: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204670.47392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204670.47395: stderr chunk (state=3): >>>debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204670.47397: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 41175 1727204670.47403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204670.47451: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204670.47459: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204670.47495: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204670.49340: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204670.49385: stderr chunk (state=3): >>><<< 41175 1727204670.49388: stdout chunk (state=3): >>><<< 41175 1727204670.49408: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204670.49411: _low_level_execute_command(): starting 41175 1727204670.49419: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204670.4074564-42920-37635106708909/AnsiballZ_ping.py && sleep 0' 41175 1727204670.49882: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204670.49886: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 41175 1727204670.49888: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address <<< 41175 1727204670.49897: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 41175 1727204670.49899: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204670.49950: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204670.49958: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 41175 1727204670.49997: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204670.67233: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 41175 1727204670.68495: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 41175 1727204670.68668: stderr chunk (state=3): >>><<< 41175 1727204670.68680: stdout chunk (state=3): >>><<< 41175 1727204670.68897: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
41175 1727204670.68902: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204670.4074564-42920-37635106708909/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41175 1727204670.68906: _low_level_execute_command(): starting 41175 1727204670.68908: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204670.4074564-42920-37635106708909/ > /dev/null 2>&1 && sleep 0' 41175 1727204670.70311: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204670.70373: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master <<< 41175 1727204670.70404: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204670.70430: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204670.70508: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204670.72527: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204670.72541: stdout chunk (state=3): >>><<< 41175 1727204670.72562: stderr chunk (state=3): >>><<< 41175 1727204670.72592: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204670.72613: handler run complete 41175 1727204670.72638: attempt loop complete, returning result 41175 1727204670.72647: _execute() done 41175 1727204670.72663: dumping result to json 41175 1727204670.72678: done dumping 
result, returning 41175 1727204670.72699: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [12b410aa-8751-f070-39c4-0000000000cc] 41175 1727204670.72712: sending task result for task 12b410aa-8751-f070-39c4-0000000000cc ok: [managed-node3] => { "changed": false, "ping": "pong" } 41175 1727204670.72907: no more pending results, returning what we have 41175 1727204670.72913: results queue empty 41175 1727204670.72914: checking for any_errors_fatal 41175 1727204670.72923: done checking for any_errors_fatal 41175 1727204670.72924: checking for max_fail_percentage 41175 1727204670.72926: done checking for max_fail_percentage 41175 1727204670.72928: checking to see if all hosts have failed and the running result is not ok 41175 1727204670.72929: done checking to see if all hosts have failed 41175 1727204670.72930: getting the remaining hosts for this loop 41175 1727204670.72933: done getting the remaining hosts for this loop 41175 1727204670.72938: getting the next task for host managed-node3 41175 1727204670.72948: done getting next task for host managed-node3 41175 1727204670.72951: ^ task is: TASK: meta (role_complete) 41175 1727204670.72954: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204670.72967: getting variables 41175 1727204670.72970: in VariableManager get_vars() 41175 1727204670.73227: Calling all_inventory to load vars for managed-node3 41175 1727204670.73231: Calling groups_inventory to load vars for managed-node3 41175 1727204670.73234: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204670.73248: Calling all_plugins_play to load vars for managed-node3 41175 1727204670.73252: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204670.73256: Calling groups_plugins_play to load vars for managed-node3 41175 1727204670.73820: done sending task result for task 12b410aa-8751-f070-39c4-0000000000cc 41175 1727204670.73824: WORKER PROCESS EXITING 41175 1727204670.76210: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204670.79338: done with get_vars() 41175 1727204670.79379: done getting variables 41175 1727204670.79479: done queuing things up, now waiting for results queue to drain 41175 1727204670.79482: results queue empty 41175 1727204670.79483: checking for any_errors_fatal 41175 1727204670.79487: done checking for any_errors_fatal 41175 1727204670.79488: checking for max_fail_percentage 41175 1727204670.79491: done checking for max_fail_percentage 41175 1727204670.79492: checking to see if all hosts have failed and the running result is not ok 41175 1727204670.79493: done checking to see if all hosts have failed 41175 1727204670.79494: getting the remaining hosts for this loop 41175 1727204670.79495: done getting the remaining hosts for this loop 41175 1727204670.79499: getting the next task for host managed-node3 41175 1727204670.79504: done getting next task for host managed-node3 41175 1727204670.79506: ^ task is: TASK: meta (flush_handlers) 41175 1727204670.79508: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41175 1727204670.79511: getting variables 41175 1727204670.79513: in VariableManager get_vars() 41175 1727204670.79530: Calling all_inventory to load vars for managed-node3 41175 1727204670.79533: Calling groups_inventory to load vars for managed-node3 41175 1727204670.79536: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204670.79542: Calling all_plugins_play to load vars for managed-node3 41175 1727204670.79545: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204670.79550: Calling groups_plugins_play to load vars for managed-node3 41175 1727204670.81526: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204670.84533: done with get_vars() 41175 1727204670.84569: done getting variables 41175 1727204670.84635: in VariableManager get_vars() 41175 1727204670.84651: Calling all_inventory to load vars for managed-node3 41175 1727204670.84654: Calling groups_inventory to load vars for managed-node3 41175 1727204670.84657: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204670.84663: Calling all_plugins_play to load vars for managed-node3 41175 1727204670.84666: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204670.84670: Calling groups_plugins_play to load vars for managed-node3 41175 1727204670.86853: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204670.89857: done with get_vars() 41175 1727204670.89907: done queuing things up, now waiting for results queue to drain 41175 1727204670.89910: results queue empty 41175 1727204670.89911: checking for any_errors_fatal 41175 1727204670.89913: done checking for any_errors_fatal 41175 1727204670.89914: checking for 
max_fail_percentage 41175 1727204670.89915: done checking for max_fail_percentage 41175 1727204670.89919: checking to see if all hosts have failed and the running result is not ok 41175 1727204670.89920: done checking to see if all hosts have failed 41175 1727204670.89921: getting the remaining hosts for this loop 41175 1727204670.89922: done getting the remaining hosts for this loop 41175 1727204670.89926: getting the next task for host managed-node3 41175 1727204670.89931: done getting next task for host managed-node3 41175 1727204670.89933: ^ task is: TASK: meta (flush_handlers) 41175 1727204670.89935: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41175 1727204670.89938: getting variables 41175 1727204670.89939: in VariableManager get_vars() 41175 1727204670.89955: Calling all_inventory to load vars for managed-node3 41175 1727204670.89958: Calling groups_inventory to load vars for managed-node3 41175 1727204670.89961: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204670.89967: Calling all_plugins_play to load vars for managed-node3 41175 1727204670.89971: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204670.89975: Calling groups_plugins_play to load vars for managed-node3 41175 1727204670.96464: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204670.99392: done with get_vars() 41175 1727204670.99440: done getting variables 41175 1727204670.99504: in VariableManager get_vars() 41175 1727204670.99522: Calling all_inventory to load vars for managed-node3 41175 1727204670.99524: Calling groups_inventory to load vars for managed-node3 41175 1727204670.99527: Calling all_plugins_inventory to load vars 
for managed-node3 41175 1727204670.99533: Calling all_plugins_play to load vars for managed-node3 41175 1727204670.99535: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204670.99538: Calling groups_plugins_play to load vars for managed-node3 41175 1727204671.01599: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204671.04701: done with get_vars() 41175 1727204671.04753: done queuing things up, now waiting for results queue to drain 41175 1727204671.04755: results queue empty 41175 1727204671.04757: checking for any_errors_fatal 41175 1727204671.04759: done checking for any_errors_fatal 41175 1727204671.04760: checking for max_fail_percentage 41175 1727204671.04761: done checking for max_fail_percentage 41175 1727204671.04762: checking to see if all hosts have failed and the running result is not ok 41175 1727204671.04763: done checking to see if all hosts have failed 41175 1727204671.04764: getting the remaining hosts for this loop 41175 1727204671.04765: done getting the remaining hosts for this loop 41175 1727204671.04769: getting the next task for host managed-node3 41175 1727204671.04773: done getting next task for host managed-node3 41175 1727204671.04774: ^ task is: None 41175 1727204671.04776: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204671.04777: done queuing things up, now waiting for results queue to drain 41175 1727204671.04778: results queue empty 41175 1727204671.04779: checking for any_errors_fatal 41175 1727204671.04780: done checking for any_errors_fatal 41175 1727204671.04781: checking for max_fail_percentage 41175 1727204671.04783: done checking for max_fail_percentage 41175 1727204671.04783: checking to see if all hosts have failed and the running result is not ok 41175 1727204671.04784: done checking to see if all hosts have failed 41175 1727204671.04786: getting the next task for host managed-node3 41175 1727204671.04790: done getting next task for host managed-node3 41175 1727204671.04791: ^ task is: None 41175 1727204671.04793: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204671.04837: in VariableManager get_vars() 41175 1727204671.04856: done with get_vars() 41175 1727204671.04864: in VariableManager get_vars() 41175 1727204671.04875: done with get_vars() 41175 1727204671.04880: variable 'omit' from source: magic vars 41175 1727204671.04921: in VariableManager get_vars() 41175 1727204671.04935: done with get_vars() 41175 1727204671.04960: variable 'omit' from source: magic vars PLAY [Delete the interface] **************************************************** 41175 1727204671.05266: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 41175 1727204671.05293: getting the remaining hosts for this loop 41175 1727204671.05295: done getting the remaining hosts for this loop 41175 1727204671.05299: getting the next task for host managed-node3 41175 1727204671.05302: done getting next task for host managed-node3 41175 1727204671.05305: ^ task is: TASK: Gathering Facts 41175 1727204671.05307: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204671.05309: getting variables 41175 1727204671.05310: in VariableManager get_vars() 41175 1727204671.05320: Calling all_inventory to load vars for managed-node3 41175 1727204671.05323: Calling groups_inventory to load vars for managed-node3 41175 1727204671.05326: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204671.05333: Calling all_plugins_play to load vars for managed-node3 41175 1727204671.05336: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204671.05340: Calling groups_plugins_play to load vars for managed-node3 41175 1727204671.07666: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204671.10797: done with get_vars() 41175 1727204671.10833: done getting variables 41175 1727204671.10894: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:5 Tuesday 24 September 2024 15:04:31 -0400 (0:00:00.745) 0:00:38.248 ***** 41175 1727204671.10923: entering _queue_task() for managed-node3/gather_facts 41175 1727204671.11288: worker is 1 (out of 1 available) 41175 1727204671.11304: exiting _queue_task() for managed-node3/gather_facts 41175 1727204671.11320: done queuing things up, now waiting for results queue to drain 41175 1727204671.11322: waiting for pending results... 
41175 1727204671.11610: running TaskExecutor() for managed-node3/TASK: Gathering Facts 41175 1727204671.11797: in run() - task 12b410aa-8751-f070-39c4-00000000076f 41175 1727204671.11801: variable 'ansible_search_path' from source: unknown 41175 1727204671.11804: calling self._execute() 41175 1727204671.11901: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204671.11916: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204671.11935: variable 'omit' from source: magic vars 41175 1727204671.12399: variable 'ansible_distribution_major_version' from source: facts 41175 1727204671.12423: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204671.12438: variable 'omit' from source: magic vars 41175 1727204671.12476: variable 'omit' from source: magic vars 41175 1727204671.12534: variable 'omit' from source: magic vars 41175 1727204671.12587: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41175 1727204671.12695: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41175 1727204671.12698: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41175 1727204671.12701: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204671.12711: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204671.12755: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41175 1727204671.12765: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204671.12775: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204671.12901: Set connection var ansible_shell_executable to /bin/sh 41175 1727204671.12909: Set 
connection var ansible_shell_type to sh 41175 1727204671.12927: Set connection var ansible_pipelining to False 41175 1727204671.13095: Set connection var ansible_timeout to 10 41175 1727204671.13099: Set connection var ansible_connection to ssh 41175 1727204671.13102: Set connection var ansible_module_compression to ZIP_DEFLATED 41175 1727204671.13104: variable 'ansible_shell_executable' from source: unknown 41175 1727204671.13106: variable 'ansible_connection' from source: unknown 41175 1727204671.13108: variable 'ansible_module_compression' from source: unknown 41175 1727204671.13111: variable 'ansible_shell_type' from source: unknown 41175 1727204671.13113: variable 'ansible_shell_executable' from source: unknown 41175 1727204671.13115: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204671.13120: variable 'ansible_pipelining' from source: unknown 41175 1727204671.13123: variable 'ansible_timeout' from source: unknown 41175 1727204671.13126: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204671.13269: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41175 1727204671.13290: variable 'omit' from source: magic vars 41175 1727204671.13303: starting attempt loop 41175 1727204671.13311: running the handler 41175 1727204671.13339: variable 'ansible_facts' from source: unknown 41175 1727204671.13371: _low_level_execute_command(): starting 41175 1727204671.13386: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41175 1727204671.14199: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 1727204671.14215: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41175 
1727204671.14243: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204671.14266: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204671.14321: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204671.14421: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204671.14449: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204671.14463: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204671.14546: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204671.16321: stdout chunk (state=3): >>>/root <<< 41175 1727204671.16535: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204671.16539: stdout chunk (state=3): >>><<< 41175 1727204671.16541: stderr chunk (state=3): >>><<< 41175 1727204671.16571: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204671.16692: _low_level_execute_command(): starting 41175 1727204671.16697: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204671.1657844-42943-112209968396822 `" && echo ansible-tmp-1727204671.1657844-42943-112209968396822="` echo /root/.ansible/tmp/ansible-tmp-1727204671.1657844-42943-112209968396822 `" ) && sleep 0' 41175 1727204671.17305: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 1727204671.17321: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204671.17368: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204671.17478: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204671.17481: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204671.17507: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204671.17527: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204671.17548: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204671.17620: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204671.19677: stdout chunk (state=3): >>>ansible-tmp-1727204671.1657844-42943-112209968396822=/root/.ansible/tmp/ansible-tmp-1727204671.1657844-42943-112209968396822 <<< 41175 1727204671.19863: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204671.19878: stdout chunk (state=3): >>><<< 41175 1727204671.19892: stderr chunk (state=3): >>><<< 41175 1727204671.19916: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204671.1657844-42943-112209968396822=/root/.ansible/tmp/ansible-tmp-1727204671.1657844-42943-112209968396822 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204671.20095: variable 'ansible_module_compression' from source: unknown 41175 1727204671.20099: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-41175qa2aqh90/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 41175 1727204671.20101: variable 'ansible_facts' from source: unknown 41175 1727204671.20305: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204671.1657844-42943-112209968396822/AnsiballZ_setup.py 41175 1727204671.20571: Sending initial data 41175 1727204671.20574: Sent initial data (154 bytes) 41175 1727204671.21174: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 1727204671.21195: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204671.21215: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204671.21312: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204671.21353: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204671.21370: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204671.21392: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204671.21453: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204671.23151: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41175 1727204671.23220: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41175 1727204671.23279: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41175qa2aqh90/tmpec90mffj /root/.ansible/tmp/ansible-tmp-1727204671.1657844-42943-112209968396822/AnsiballZ_setup.py <<< 41175 1727204671.23283: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204671.1657844-42943-112209968396822/AnsiballZ_setup.py" <<< 41175 1727204671.23325: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-41175qa2aqh90/tmpec90mffj" to remote "/root/.ansible/tmp/ansible-tmp-1727204671.1657844-42943-112209968396822/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204671.1657844-42943-112209968396822/AnsiballZ_setup.py" <<< 41175 1727204671.25908: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204671.25982: stderr chunk (state=3): >>><<< 41175 1727204671.25987: stdout chunk (state=3): >>><<< 41175 1727204671.26015: done transferring module to remote 41175 1727204671.26027: _low_level_execute_command(): starting 41175 1727204671.26032: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204671.1657844-42943-112209968396822/ /root/.ansible/tmp/ansible-tmp-1727204671.1657844-42943-112209968396822/AnsiballZ_setup.py && sleep 0' 41175 1727204671.26464: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204671.26501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204671.26504: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 
1727204671.26506: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41175 1727204671.26510: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204671.26513: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204671.26571: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204671.26573: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204671.26608: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204671.28553: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204671.28605: stderr chunk (state=3): >>><<< 41175 1727204671.28608: stdout chunk (state=3): >>><<< 41175 1727204671.28620: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204671.28646: _low_level_execute_command(): starting 41175 1727204671.28650: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204671.1657844-42943-112209968396822/AnsiballZ_setup.py && sleep 0' 41175 1727204671.29072: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204671.29076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204671.29079: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204671.29081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 41175 1727204671.29084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204671.29138: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 
setting O_NONBLOCK <<< 41175 1727204671.29143: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204671.29187: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204672.01827: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec272ed147e29e35f2e68cd6465c5ec1", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": 
"(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_pkg_mgr": "dnf", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAI5YZQ7OH6eqgmanrwxkUl16pMvE2q26X32NofYRKBzF04m84VIsiCBP80rN+sGEKnRhTwlxJwcSfAyscmxkynk8ozeR0SaMEECkbOjee1DqGR1yz8VSKEIk2gZ+ImYscF6c32jGvz1w/gz9baswEs+v92Ljqv3+V3s8foVkwWM1AAAAFQDApo03iAyJzp9y7AillVl9LpN8rwAAAIBNHNvfLLH/rvWMdavYWGiljarx5Z8cDKFv4QiliuY2AenrQ5mjBN3ZJZuDpmwC9vuoPM+TWxp9pbrnVJy4VM6iS8c/Lr9I982fUD4neMvJEywdnYtsRhezGMCk57/Npw91h6EKhcAYiaFF53jl540WIjTvu2bEA8Hgb11YGH+isAAAAIAkremps+61DEFeDWQjRHbf8fZzhmpUYduU+sHRW5usa/1cOOeeN/8XBHfMST6TPedAY/6t7Oxda9D2mq6mo2Rl9arSQWcBypqwvzRiz0LGnRnElGtXKJALy6vYKG7xi+29ZmqlBvD14cB7/wSZqZP9MkRj3+QzQJLvNnuGRyLguA==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDBj8PEtqglWtlJ3r3hgP2TELjSd8JOOpjIitLlWjKdUao5ePB6PWTf9MZV0rLZr0re7hAS1EWeexARYQakyETmyOoPmRCaD5vvrfN3AJJ6I+O2EhApLpYrEORJbTfrme6AoCGmxQG8tR7j3YpVOvePZ65ka7FDUWnNLI0DWpyDURAKmvOxtiOcYazpmB7GJ/5ycpEAV7KGp7tEQ9MNIAbSaYTBXVBNa5V2HyEmcabs+/Qy/jp8OWy+Tl3uCUV0SmFplVCKib9Kp3eEMZd5udXsYnmUYtLNMJQkQOzTdol5AozustkdBnasVn/RSnQpWQMBrrUQMxchNOb8FDAuH6AONEVJl9mHY6mk3zfkkyPZE6sIrMIj0B48xTWzMIjC+N9SN7DRRUWzjYIqkL5fsYu0fkkGuZeNvyJRlv8h7oFWA7YtvNHdNYf41mkXryERg8V3zI0aZcmQul6XTOxywwd4b5sudMIng09hfyPOKtnYi6DIN2h5FxOWlvBEbLlcd2U=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPUvqdp1GSRMDwSqfOZO1hLGpDfzy51B9cIhTK2AWy7qlUXPaSlJ0jc31uj+CW3SnUW36VSKRHdj9R9hJev9Zic=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", 
"ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFL7RdA+aCgUcBhcJBLwti3mnwduhYXxSw8RlI3Cvebm", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_is_chroot": false, "ansible_iscsi_iqn": "", "ansible_apparmor": {"status": "disabled"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_co<<< 41175 1727204672.01848: stdout chunk (state=3): >>>unt": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2842, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 875, "free": 2842}, "nocache": {"free": 3472, "used": 245}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec272ed1-47e2-9e35-f2e6-8cd6465c5ec1", "ansible_product_uuid": "ec272ed1-47e2-9e35-f2e6-8cd6465c5ec1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", 
"sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 1175, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251148595200, "block_size": 4096, "block_total": 64479564, "block_available": 61315575, "block_used": 3163989, "inode_total": 16384000, "inode_available": 16302069, "inode_used": 81931, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_interfaces": ["peerethtest0", "eth0", "lo", "ethtest0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:5e:c8:16:36:1d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.90", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::37d3:4e93:30d:de94", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", 
"tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fix<<< 41175 1727204672.01867: stdout chunk (state=3): >>>ed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": 
"off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": 
"off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_ethtest0": {"device": "ethtest0", "macaddress": "86:6c:78:87:31:5f", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::7ad5:7db7:8a46:12ed", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off 
[fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerethtest0": {"device": "peerethtest0", "macaddress": "b2:8f:09:74:fb:c0", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::b08f:9ff:fe74:fbc0", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", 
"generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback<<< 41175 1727204672.01878: stdout chunk (state=3): >>>": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.90", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:5e:c8:16:36:1d", "mtu": 9001, "type": "ether", "alias": 
"eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.90"], "ansible_all_ipv6_addresses": ["fe80::37d3:4e93:30d:de94", "fe80::7ad5:7db7:8a46:12ed", "fe80::b08f:9ff:fe74:fbc0"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::37d3:4e93:30d:de94", "fe80::7ad5:7db7:8a46:12ed", "fe80::b08f:9ff:fe74:fbc0"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "04", "second": "32", "epoch": "1727204672", "epoch_int": "1727204672", "date": "2024-09-24", "time": "15:04:32", "iso8601_micro": "2024-09-24T19:04:32.013408Z", "iso8601": "2024-09-24T19:04:32Z", "iso8601_basic": "20240924T150432013408", "iso8601_basic_short": "20240924T150432", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 50414 10.31.10.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 50414 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_local": {}, "ansible_hostnqn": "", "ansible_service_mgr": "systemd", "ansible_lsb": {}, "ansible_fips": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", 
"has_sslcontext": true, "type": "cpython"}, "ansible_loadavg": {"1m": 1.0712890625, "5m": 0.89013671875, "15m": 0.544921875}, "ansible_fibre_channel_wwn": [], "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 41175 1727204672.04296: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 41175 1727204672.04301: stdout chunk (state=3): >>><<< 41175 1727204672.04304: stderr chunk (state=3): >>><<< 41175 1727204672.04314: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": 
"managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec272ed147e29e35f2e68cd6465c5ec1", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_pkg_mgr": "dnf", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAI5YZQ7OH6eqgmanrwxkUl16pMvE2q26X32NofYRKBzF04m84VIsiCBP80rN+sGEKnRhTwlxJwcSfAyscmxkynk8ozeR0SaMEECkbOjee1DqGR1yz8VSKEIk2gZ+ImYscF6c32jGvz1w/gz9baswEs+v92Ljqv3+V3s8foVkwWM1AAAAFQDApo03iAyJzp9y7AillVl9LpN8rwAAAIBNHNvfLLH/rvWMdavYWGiljarx5Z8cDKFv4QiliuY2AenrQ5mjBN3ZJZuDpmwC9vuoPM+TWxp9pbrnVJy4VM6iS8c/Lr9I982fUD4neMvJEywdnYtsRhezGMCk57/Npw91h6EKhcAYiaFF53jl540WIjTvu2bEA8Hgb11YGH+isAAAAIAkremps+61DEFeDWQjRHbf8fZzhmpUYduU+sHRW5usa/1cOOeeN/8XBHfMST6TPedAY/6t7Oxda9D2mq6mo2Rl9arSQWcBypqwvzRiz0LGnRnElGtXKJALy6vYKG7xi+29ZmqlBvD14cB7/wSZqZP9MkRj3+QzQJLvNnuGRyLguA==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQDBj8PEtqglWtlJ3r3hgP2TELjSd8JOOpjIitLlWjKdUao5ePB6PWTf9MZV0rLZr0re7hAS1EWeexARYQakyETmyOoPmRCaD5vvrfN3AJJ6I+O2EhApLpYrEORJbTfrme6AoCGmxQG8tR7j3YpVOvePZ65ka7FDUWnNLI0DWpyDURAKmvOxtiOcYazpmB7GJ/5ycpEAV7KGp7tEQ9MNIAbSaYTBXVBNa5V2HyEmcabs+/Qy/jp8OWy+Tl3uCUV0SmFplVCKib9Kp3eEMZd5udXsYnmUYtLNMJQkQOzTdol5AozustkdBnasVn/RSnQpWQMBrrUQMxchNOb8FDAuH6AONEVJl9mHY6mk3zfkkyPZE6sIrMIj0B48xTWzMIjC+N9SN7DRRUWzjYIqkL5fsYu0fkkGuZeNvyJRlv8h7oFWA7YtvNHdNYf41mkXryERg8V3zI0aZcmQul6XTOxywwd4b5sudMIng09hfyPOKtnYi6DIN2h5FxOWlvBEbLlcd2U=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPUvqdp1GSRMDwSqfOZO1hLGpDfzy51B9cIhTK2AWy7qlUXPaSlJ0jc31uj+CW3SnUW36VSKRHdj9R9hJev9Zic=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFL7RdA+aCgUcBhcJBLwti3mnwduhYXxSw8RlI3Cvebm", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_is_chroot": false, "ansible_iscsi_iqn": "", "ansible_apparmor": {"status": "disabled"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2842, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 875, "free": 2842}, "nocache": {"free": 3472, "used": 245}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", 
"ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec272ed1-47e2-9e35-f2e6-8cd6465c5ec1", "ansible_product_uuid": "ec272ed1-47e2-9e35-f2e6-8cd6465c5ec1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 1175, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251148595200, 
"block_size": 4096, "block_total": 64479564, "block_available": 61315575, "block_used": 3163989, "inode_total": 16384000, "inode_available": 16302069, "inode_used": 81931, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_interfaces": ["peerethtest0", "eth0", "lo", "ethtest0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:5e:c8:16:36:1d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.90", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::37d3:4e93:30d:de94", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off 
[fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": 
"off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_ethtest0": {"device": "ethtest0", "macaddress": "86:6c:78:87:31:5f", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::7ad5:7db7:8a46:12ed", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", 
"tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, 
"ansible_peerethtest0": {"device": "peerethtest0", "macaddress": "b2:8f:09:74:fb:c0", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::b08f:9ff:fe74:fbc0", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", 
"tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.90", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:5e:c8:16:36:1d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.90"], "ansible_all_ipv6_addresses": ["fe80::37d3:4e93:30d:de94", "fe80::7ad5:7db7:8a46:12ed", "fe80::b08f:9ff:fe74:fbc0"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::37d3:4e93:30d:de94", "fe80::7ad5:7db7:8a46:12ed", "fe80::b08f:9ff:fe74:fbc0"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "04", "second": "32", "epoch": "1727204672", "epoch_int": "1727204672", "date": "2024-09-24", "time": "15:04:32", "iso8601_micro": "2024-09-24T19:04:32.013408Z", "iso8601": "2024-09-24T19:04:32Z", "iso8601_basic": "20240924T150432013408", "iso8601_basic_short": "20240924T150432", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 50414 10.31.10.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", 
"SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 50414 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_local": {}, "ansible_hostnqn": "", "ansible_service_mgr": "systemd", "ansible_lsb": {}, "ansible_fips": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_loadavg": {"1m": 1.0712890625, "5m": 0.89013671875, "15m": 0.544921875}, "ansible_fibre_channel_wwn": [], "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 41175 1727204672.04852: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204671.1657844-42943-112209968396822/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41175 1727204672.04873: _low_level_execute_command(): starting 41175 1727204672.04879: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204671.1657844-42943-112209968396822/ > /dev/null 2>&1 && sleep 0' 41175 1727204672.05344: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204672.05348: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 41175 1727204672.05350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204672.05353: stderr chunk (state=3): >>>debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204672.05355: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204672.05409: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204672.05417: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204672.05457: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204672.07520: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204672.07524: stdout chunk (state=3): >>><<< 41175 1727204672.07526: stderr chunk (state=3): >>><<< 41175 1727204672.07543: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204672.07559: handler run complete 41175 1727204672.07896: variable 'ansible_facts' from source: unknown 41175 1727204672.07976: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204672.08588: variable 'ansible_facts' from source: unknown 41175 1727204672.08741: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204672.09006: attempt loop complete, returning result 41175 1727204672.09019: _execute() done 41175 1727204672.09028: dumping result to json 41175 1727204672.09084: done dumping result, returning 41175 1727204672.09101: done running TaskExecutor() for managed-node3/TASK: Gathering Facts [12b410aa-8751-f070-39c4-00000000076f] 41175 1727204672.09113: sending task result for task 12b410aa-8751-f070-39c4-00000000076f ok: [managed-node3] 41175 1727204672.10607: done sending task result for task 12b410aa-8751-f070-39c4-00000000076f 41175 1727204672.10611: WORKER PROCESS EXITING 41175 1727204672.10631: no more pending results, returning what we have 41175 1727204672.10635: results queue empty 41175 1727204672.10636: checking for any_errors_fatal 41175 1727204672.10638: done checking for any_errors_fatal 41175 1727204672.10639: checking for max_fail_percentage 41175 1727204672.10641: done checking for max_fail_percentage 41175 1727204672.10642: checking to see if all hosts have failed and the running result is not ok 41175 1727204672.10643: done checking to see if all hosts have failed 41175 1727204672.10644: getting the remaining hosts for this loop 41175 1727204672.10646: done getting the remaining hosts for this loop 41175 
1727204672.10650: getting the next task for host managed-node3 41175 1727204672.10656: done getting next task for host managed-node3 41175 1727204672.10658: ^ task is: TASK: meta (flush_handlers) 41175 1727204672.10660: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41175 1727204672.10665: getting variables 41175 1727204672.10666: in VariableManager get_vars() 41175 1727204672.10694: Calling all_inventory to load vars for managed-node3 41175 1727204672.10697: Calling groups_inventory to load vars for managed-node3 41175 1727204672.10701: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204672.10715: Calling all_plugins_play to load vars for managed-node3 41175 1727204672.10719: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204672.10724: Calling groups_plugins_play to load vars for managed-node3 41175 1727204672.13003: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204672.16081: done with get_vars() 41175 1727204672.16129: done getting variables 41175 1727204672.16216: in VariableManager get_vars() 41175 1727204672.16230: Calling all_inventory to load vars for managed-node3 41175 1727204672.16233: Calling groups_inventory to load vars for managed-node3 41175 1727204672.16236: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204672.16242: Calling all_plugins_play to load vars for managed-node3 41175 1727204672.16246: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204672.16249: Calling groups_plugins_play to load vars for managed-node3 41175 1727204672.18531: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to 
reserved name 41175 1727204672.21258: done with get_vars() 41175 1727204672.21326: done queuing things up, now waiting for results queue to drain 41175 1727204672.21329: results queue empty 41175 1727204672.21330: checking for any_errors_fatal 41175 1727204672.21337: done checking for any_errors_fatal 41175 1727204672.21339: checking for max_fail_percentage 41175 1727204672.21340: done checking for max_fail_percentage 41175 1727204672.21341: checking to see if all hosts have failed and the running result is not ok 41175 1727204672.21349: done checking to see if all hosts have failed 41175 1727204672.21350: getting the remaining hosts for this loop 41175 1727204672.21352: done getting the remaining hosts for this loop 41175 1727204672.21356: getting the next task for host managed-node3 41175 1727204672.21362: done getting next task for host managed-node3 41175 1727204672.21365: ^ task is: TASK: Include the task 'delete_interface.yml' 41175 1727204672.21367: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204672.21371: getting variables 41175 1727204672.21372: in VariableManager get_vars() 41175 1727204672.21386: Calling all_inventory to load vars for managed-node3 41175 1727204672.21391: Calling groups_inventory to load vars for managed-node3 41175 1727204672.21395: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204672.21404: Calling all_plugins_play to load vars for managed-node3 41175 1727204672.21407: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204672.21411: Calling groups_plugins_play to load vars for managed-node3 41175 1727204672.23519: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204672.26528: done with get_vars() 41175 1727204672.26573: done getting variables TASK [Include the task 'delete_interface.yml'] ********************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:8 Tuesday 24 September 2024 15:04:32 -0400 (0:00:01.157) 0:00:39.405 ***** 41175 1727204672.26672: entering _queue_task() for managed-node3/include_tasks 41175 1727204672.27068: worker is 1 (out of 1 available) 41175 1727204672.27082: exiting _queue_task() for managed-node3/include_tasks 41175 1727204672.27297: done queuing things up, now waiting for results queue to drain 41175 1727204672.27300: waiting for pending results... 
41175 1727204672.27406: running TaskExecutor() for managed-node3/TASK: Include the task 'delete_interface.yml' 41175 1727204672.27537: in run() - task 12b410aa-8751-f070-39c4-0000000000cf 41175 1727204672.27561: variable 'ansible_search_path' from source: unknown 41175 1727204672.27608: calling self._execute() 41175 1727204672.27724: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204672.27745: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204672.27766: variable 'omit' from source: magic vars 41175 1727204672.28239: variable 'ansible_distribution_major_version' from source: facts 41175 1727204672.28260: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204672.28272: _execute() done 41175 1727204672.28284: dumping result to json 41175 1727204672.28299: done dumping result, returning 41175 1727204672.28311: done running TaskExecutor() for managed-node3/TASK: Include the task 'delete_interface.yml' [12b410aa-8751-f070-39c4-0000000000cf] 41175 1727204672.28324: sending task result for task 12b410aa-8751-f070-39c4-0000000000cf 41175 1727204672.28573: no more pending results, returning what we have 41175 1727204672.28579: in VariableManager get_vars() 41175 1727204672.28617: Calling all_inventory to load vars for managed-node3 41175 1727204672.28621: Calling groups_inventory to load vars for managed-node3 41175 1727204672.28625: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204672.28643: Calling all_plugins_play to load vars for managed-node3 41175 1727204672.28647: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204672.28651: Calling groups_plugins_play to load vars for managed-node3 41175 1727204672.29307: done sending task result for task 12b410aa-8751-f070-39c4-0000000000cf 41175 1727204672.29310: WORKER PROCESS EXITING 41175 1727204672.31161: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204672.33486: done with get_vars() 41175 1727204672.33512: variable 'ansible_search_path' from source: unknown 41175 1727204672.33527: we have included files to process 41175 1727204672.33528: generating all_blocks data 41175 1727204672.33529: done generating all_blocks data 41175 1727204672.33529: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 41175 1727204672.33530: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 41175 1727204672.33532: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 41175 1727204672.33730: done processing included file 41175 1727204672.33731: iterating over new_blocks loaded from include file 41175 1727204672.33733: in VariableManager get_vars() 41175 1727204672.33743: done with get_vars() 41175 1727204672.33744: filtering new block on tags 41175 1727204672.33757: done filtering new block on tags 41175 1727204672.33758: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml for managed-node3 41175 1727204672.33763: extending task lists for all hosts with included blocks 41175 1727204672.33787: done extending task lists 41175 1727204672.33787: done processing included files 41175 1727204672.33788: results queue empty 41175 1727204672.33790: checking for any_errors_fatal 41175 1727204672.33792: done checking for any_errors_fatal 41175 1727204672.33793: checking for max_fail_percentage 41175 1727204672.33793: done checking for max_fail_percentage 41175 1727204672.33794: checking to see if all hosts have failed and the running result 
is not ok 41175 1727204672.33795: done checking to see if all hosts have failed 41175 1727204672.33795: getting the remaining hosts for this loop 41175 1727204672.33796: done getting the remaining hosts for this loop 41175 1727204672.33798: getting the next task for host managed-node3 41175 1727204672.33802: done getting next task for host managed-node3 41175 1727204672.33804: ^ task is: TASK: Remove test interface if necessary 41175 1727204672.33807: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204672.33808: getting variables 41175 1727204672.33809: in VariableManager get_vars() 41175 1727204672.33817: Calling all_inventory to load vars for managed-node3 41175 1727204672.33819: Calling groups_inventory to load vars for managed-node3 41175 1727204672.33821: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204672.33826: Calling all_plugins_play to load vars for managed-node3 41175 1727204672.33828: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204672.33830: Calling groups_plugins_play to load vars for managed-node3 41175 1727204672.35434: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204672.37671: done with get_vars() 41175 1727204672.37701: done getting variables 41175 1727204672.37744: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Remove test interface if necessary] ************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml:3 Tuesday 24 September 2024 15:04:32 -0400 (0:00:00.110) 0:00:39.516 ***** 41175 1727204672.37771: entering _queue_task() for managed-node3/command 41175 1727204672.38040: worker is 1 (out of 1 available) 41175 1727204672.38056: exiting _queue_task() for managed-node3/command 41175 1727204672.38069: done queuing things up, now waiting for results queue to drain 41175 1727204672.38071: waiting for pending results... 
41175 1727204672.38271: running TaskExecutor() for managed-node3/TASK: Remove test interface if necessary 41175 1727204672.38361: in run() - task 12b410aa-8751-f070-39c4-000000000780 41175 1727204672.38374: variable 'ansible_search_path' from source: unknown 41175 1727204672.38378: variable 'ansible_search_path' from source: unknown 41175 1727204672.38415: calling self._execute() 41175 1727204672.38504: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204672.38516: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204672.38524: variable 'omit' from source: magic vars 41175 1727204672.38929: variable 'ansible_distribution_major_version' from source: facts 41175 1727204672.38949: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204672.38962: variable 'omit' from source: magic vars 41175 1727204672.39018: variable 'omit' from source: magic vars 41175 1727204672.39165: variable 'interface' from source: set_fact 41175 1727204672.39195: variable 'omit' from source: magic vars 41175 1727204672.39255: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41175 1727204672.39305: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41175 1727204672.39336: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41175 1727204672.39372: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204672.39472: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204672.39475: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41175 1727204672.39478: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204672.39481: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204672.39599: Set connection var ansible_shell_executable to /bin/sh 41175 1727204672.39609: Set connection var ansible_shell_type to sh 41175 1727204672.39623: Set connection var ansible_pipelining to False 41175 1727204672.39641: Set connection var ansible_timeout to 10 41175 1727204672.39653: Set connection var ansible_connection to ssh 41175 1727204672.39665: Set connection var ansible_module_compression to ZIP_DEFLATED 41175 1727204672.39794: variable 'ansible_shell_executable' from source: unknown 41175 1727204672.39798: variable 'ansible_connection' from source: unknown 41175 1727204672.39801: variable 'ansible_module_compression' from source: unknown 41175 1727204672.39805: variable 'ansible_shell_type' from source: unknown 41175 1727204672.39808: variable 'ansible_shell_executable' from source: unknown 41175 1727204672.39810: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204672.39812: variable 'ansible_pipelining' from source: unknown 41175 1727204672.39816: variable 'ansible_timeout' from source: unknown 41175 1727204672.39818: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204672.39953: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41175 1727204672.39964: variable 'omit' from source: magic vars 41175 1727204672.39970: starting attempt loop 41175 1727204672.39973: running the handler 41175 1727204672.39988: _low_level_execute_command(): starting 41175 1727204672.39998: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41175 1727204672.40541: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204672.40544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204672.40549: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 41175 1727204672.40552: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204672.40596: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204672.40599: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204672.40655: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204672.42412: stdout chunk (state=3): >>>/root <<< 41175 1727204672.42524: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204672.42574: stderr chunk (state=3): >>><<< 41175 1727204672.42579: stdout chunk (state=3): >>><<< 41175 1727204672.42604: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 
originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204672.42619: _low_level_execute_command(): starting 41175 1727204672.42623: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204672.4260447-42979-120175087490207 `" && echo ansible-tmp-1727204672.4260447-42979-120175087490207="` echo /root/.ansible/tmp/ansible-tmp-1727204672.4260447-42979-120175087490207 `" ) && sleep 0' 41175 1727204672.43049: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204672.43084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204672.43087: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204672.43100: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 41175 1727204672.43103: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 41175 1727204672.43105: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204672.43163: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204672.43167: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204672.43197: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204672.45174: stdout chunk (state=3): >>>ansible-tmp-1727204672.4260447-42979-120175087490207=/root/.ansible/tmp/ansible-tmp-1727204672.4260447-42979-120175087490207 <<< 41175 1727204672.45295: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204672.45348: stderr chunk (state=3): >>><<< 41175 1727204672.45351: stdout chunk (state=3): >>><<< 41175 1727204672.45367: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204672.4260447-42979-120175087490207=/root/.ansible/tmp/ansible-tmp-1727204672.4260447-42979-120175087490207 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204672.45396: variable 'ansible_module_compression' from source: unknown 41175 1727204672.45446: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-41175qa2aqh90/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 41175 1727204672.45483: variable 'ansible_facts' from source: unknown 41175 1727204672.45543: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204672.4260447-42979-120175087490207/AnsiballZ_command.py 41175 1727204672.45661: Sending initial data 41175 1727204672.45665: Sent initial data (156 bytes) 41175 1727204672.46133: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204672.46136: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204672.46139: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204672.46142: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204672.46144: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204672.46194: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204672.46201: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204672.46241: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204672.47847: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 41175 1727204672.47852: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41175 1727204672.47880: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41175 1727204672.47914: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41175qa2aqh90/tmpnludegve /root/.ansible/tmp/ansible-tmp-1727204672.4260447-42979-120175087490207/AnsiballZ_command.py <<< 41175 1727204672.47917: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204672.4260447-42979-120175087490207/AnsiballZ_command.py" <<< 41175 1727204672.47942: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-41175qa2aqh90/tmpnludegve" to remote "/root/.ansible/tmp/ansible-tmp-1727204672.4260447-42979-120175087490207/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204672.4260447-42979-120175087490207/AnsiballZ_command.py" <<< 41175 1727204672.48699: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204672.48769: stderr chunk (state=3): >>><<< 41175 1727204672.48772: stdout chunk (state=3): >>><<< 41175 1727204672.48793: done transferring module to remote 41175 1727204672.48805: _low_level_execute_command(): starting 41175 1727204672.48811: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204672.4260447-42979-120175087490207/ /root/.ansible/tmp/ansible-tmp-1727204672.4260447-42979-120175087490207/AnsiballZ_command.py && sleep 0' 41175 1727204672.49248: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204672.49297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204672.49301: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 41175 1727204672.49303: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 41175 1727204672.49306: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204672.49311: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204672.49353: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204672.49360: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204672.49400: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204672.51245: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204672.51288: stderr chunk (state=3): >>><<< 41175 1727204672.51293: stdout chunk (state=3): >>><<< 41175 1727204672.51307: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204672.51311: _low_level_execute_command(): starting 41175 1727204672.51319: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204672.4260447-42979-120175087490207/AnsiballZ_command.py && sleep 0' 41175 1727204672.51753: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204672.51794: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204672.51797: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204672.51800: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204672.51802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204672.51851: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 
1727204672.51854: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204672.51901: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204672.70496: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "ethtest0"], "start": "2024-09-24 15:04:32.689154", "end": "2024-09-24 15:04:32.699721", "delta": "0:00:00.010567", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del ethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 41175 1727204672.72560: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 41175 1727204672.72625: stderr chunk (state=3): >>><<< 41175 1727204672.72629: stdout chunk (state=3): >>><<< 41175 1727204672.72646: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "ethtest0"], "start": "2024-09-24 15:04:32.689154", "end": "2024-09-24 15:04:32.699721", "delta": "0:00:00.010567", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del ethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 41175 1727204672.72680: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204672.4260447-42979-120175087490207/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41175 1727204672.72698: _low_level_execute_command(): starting 41175 1727204672.72702: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204672.4260447-42979-120175087490207/ > /dev/null 2>&1 && sleep 0' 41175 1727204672.73176: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204672.73180: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204672.73220: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 41175 1727204672.73224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204672.73230: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 41175 1727204672.73233: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204672.73292: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204672.73298: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204672.73300: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204672.73333: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204672.75244: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204672.75298: stderr chunk (state=3): >>><<< 41175 1727204672.75302: stdout chunk (state=3): >>><<< 41175 1727204672.75319: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204672.75329: handler run complete 41175 1727204672.75351: Evaluated conditional (False): False 41175 1727204672.75362: attempt loop complete, returning result 41175 1727204672.75365: _execute() done 41175 1727204672.75369: dumping result to json 41175 1727204672.75375: done dumping result, returning 41175 1727204672.75383: done running TaskExecutor() for managed-node3/TASK: Remove test interface if necessary [12b410aa-8751-f070-39c4-000000000780] 41175 1727204672.75392: sending task result for task 12b410aa-8751-f070-39c4-000000000780 41175 1727204672.75502: done sending task result for task 12b410aa-8751-f070-39c4-000000000780 41175 1727204672.75507: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": [ "ip", "link", "del", "ethtest0" ], "delta": "0:00:00.010567", "end": "2024-09-24 15:04:32.699721", "rc": 0, "start": "2024-09-24 15:04:32.689154" } 41175 1727204672.75586: no more pending results, returning what we have 41175 1727204672.75593: results queue empty 41175 1727204672.75594: checking for 
any_errors_fatal 41175 1727204672.75596: done checking for any_errors_fatal 41175 1727204672.75597: checking for max_fail_percentage 41175 1727204672.75599: done checking for max_fail_percentage 41175 1727204672.75600: checking to see if all hosts have failed and the running result is not ok 41175 1727204672.75601: done checking to see if all hosts have failed 41175 1727204672.75602: getting the remaining hosts for this loop 41175 1727204672.75604: done getting the remaining hosts for this loop 41175 1727204672.75609: getting the next task for host managed-node3 41175 1727204672.75619: done getting next task for host managed-node3 41175 1727204672.75622: ^ task is: TASK: meta (flush_handlers) 41175 1727204672.75624: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204672.75629: getting variables 41175 1727204672.75631: in VariableManager get_vars() 41175 1727204672.75662: Calling all_inventory to load vars for managed-node3 41175 1727204672.75665: Calling groups_inventory to load vars for managed-node3 41175 1727204672.75669: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204672.75681: Calling all_plugins_play to load vars for managed-node3 41175 1727204672.75684: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204672.75688: Calling groups_plugins_play to load vars for managed-node3 41175 1727204672.78071: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204672.80824: done with get_vars() 41175 1727204672.80854: done getting variables 41175 1727204672.80922: in VariableManager get_vars() 41175 1727204672.80932: Calling all_inventory to load vars for managed-node3 41175 1727204672.80934: Calling groups_inventory to load vars for managed-node3 41175 1727204672.80936: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204672.80940: Calling all_plugins_play to load vars for managed-node3 41175 1727204672.80942: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204672.80944: Calling groups_plugins_play to load vars for managed-node3 41175 1727204672.82164: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204672.83780: done with get_vars() 41175 1727204672.83811: done queuing things up, now waiting for results queue to drain 41175 1727204672.83813: results queue empty 41175 1727204672.83813: checking for any_errors_fatal 41175 1727204672.83818: done checking for any_errors_fatal 41175 1727204672.83819: checking for max_fail_percentage 41175 1727204672.83820: done checking for max_fail_percentage 41175 1727204672.83821: checking to see if all hosts have failed and the running result is not 
ok 41175 1727204672.83821: done checking to see if all hosts have failed 41175 1727204672.83822: getting the remaining hosts for this loop 41175 1727204672.83823: done getting the remaining hosts for this loop 41175 1727204672.83825: getting the next task for host managed-node3 41175 1727204672.83828: done getting next task for host managed-node3 41175 1727204672.83829: ^ task is: TASK: meta (flush_handlers) 41175 1727204672.83831: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41175 1727204672.83833: getting variables 41175 1727204672.83834: in VariableManager get_vars() 41175 1727204672.83840: Calling all_inventory to load vars for managed-node3 41175 1727204672.83842: Calling groups_inventory to load vars for managed-node3 41175 1727204672.83844: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204672.83849: Calling all_plugins_play to load vars for managed-node3 41175 1727204672.83851: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204672.83853: Calling groups_plugins_play to load vars for managed-node3 41175 1727204672.85011: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204672.86597: done with get_vars() 41175 1727204672.86623: done getting variables 41175 1727204672.86665: in VariableManager get_vars() 41175 1727204672.86671: Calling all_inventory to load vars for managed-node3 41175 1727204672.86673: Calling groups_inventory to load vars for managed-node3 41175 1727204672.86679: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204672.86683: Calling all_plugins_play to load vars for managed-node3 41175 1727204672.86685: Calling groups_plugins_inventory to load vars for 
managed-node3 41175 1727204672.86688: Calling groups_plugins_play to load vars for managed-node3 41175 1727204672.87783: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204672.89397: done with get_vars() 41175 1727204672.89429: done queuing things up, now waiting for results queue to drain 41175 1727204672.89431: results queue empty 41175 1727204672.89431: checking for any_errors_fatal 41175 1727204672.89433: done checking for any_errors_fatal 41175 1727204672.89433: checking for max_fail_percentage 41175 1727204672.89434: done checking for max_fail_percentage 41175 1727204672.89435: checking to see if all hosts have failed and the running result is not ok 41175 1727204672.89435: done checking to see if all hosts have failed 41175 1727204672.89436: getting the remaining hosts for this loop 41175 1727204672.89437: done getting the remaining hosts for this loop 41175 1727204672.89439: getting the next task for host managed-node3 41175 1727204672.89442: done getting next task for host managed-node3 41175 1727204672.89442: ^ task is: None 41175 1727204672.89444: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204672.89445: done queuing things up, now waiting for results queue to drain 41175 1727204672.89445: results queue empty 41175 1727204672.89446: checking for any_errors_fatal 41175 1727204672.89446: done checking for any_errors_fatal 41175 1727204672.89447: checking for max_fail_percentage 41175 1727204672.89448: done checking for max_fail_percentage 41175 1727204672.89448: checking to see if all hosts have failed and the running result is not ok 41175 1727204672.89449: done checking to see if all hosts have failed 41175 1727204672.89450: getting the next task for host managed-node3 41175 1727204672.89451: done getting next task for host managed-node3 41175 1727204672.89452: ^ task is: None 41175 1727204672.89453: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204672.89493: in VariableManager get_vars() 41175 1727204672.89511: done with get_vars() 41175 1727204672.89518: in VariableManager get_vars() 41175 1727204672.89531: done with get_vars() 41175 1727204672.89535: variable 'omit' from source: magic vars 41175 1727204672.89643: variable 'profile' from source: play vars 41175 1727204672.89725: in VariableManager get_vars() 41175 1727204672.89738: done with get_vars() 41175 1727204672.89757: variable 'omit' from source: magic vars 41175 1727204672.89811: variable 'profile' from source: play vars PLAY [Remove {{ profile }}] **************************************************** 41175 1727204672.90421: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 41175 1727204672.90443: getting the remaining hosts for this loop 41175 1727204672.90444: done getting the remaining hosts for this loop 41175 1727204672.90446: getting the next task for host managed-node3 41175 1727204672.90448: done getting next task for host managed-node3 41175 1727204672.90450: ^ task is: TASK: Gathering Facts 41175 1727204672.90451: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204672.90453: getting variables 41175 1727204672.90454: in VariableManager get_vars() 41175 1727204672.90462: Calling all_inventory to load vars for managed-node3 41175 1727204672.90464: Calling groups_inventory to load vars for managed-node3 41175 1727204672.90466: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204672.90470: Calling all_plugins_play to load vars for managed-node3 41175 1727204672.90472: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204672.90474: Calling groups_plugins_play to load vars for managed-node3 41175 1727204672.91713: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204672.93300: done with get_vars() 41175 1727204672.93323: done getting variables 41175 1727204672.93361: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3 Tuesday 24 September 2024 15:04:32 -0400 (0:00:00.556) 0:00:40.072 ***** 41175 1727204672.93382: entering _queue_task() for managed-node3/gather_facts 41175 1727204672.93646: worker is 1 (out of 1 available) 41175 1727204672.93660: exiting _queue_task() for managed-node3/gather_facts 41175 1727204672.93672: done queuing things up, now waiting for results queue to drain 41175 1727204672.93674: waiting for pending results... 
41175 1727204672.93868: running TaskExecutor() for managed-node3/TASK: Gathering Facts 41175 1727204672.93956: in run() - task 12b410aa-8751-f070-39c4-00000000078e 41175 1727204672.93969: variable 'ansible_search_path' from source: unknown 41175 1727204672.94004: calling self._execute() 41175 1727204672.94092: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204672.94097: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204672.94107: variable 'omit' from source: magic vars 41175 1727204672.94426: variable 'ansible_distribution_major_version' from source: facts 41175 1727204672.94437: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204672.94446: variable 'omit' from source: magic vars 41175 1727204672.94472: variable 'omit' from source: magic vars 41175 1727204672.94503: variable 'omit' from source: magic vars 41175 1727204672.94539: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41175 1727204672.94573: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41175 1727204672.94593: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41175 1727204672.94609: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204672.94622: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204672.94651: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41175 1727204672.94654: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204672.94659: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204672.94746: Set connection var ansible_shell_executable to /bin/sh 41175 1727204672.94750: Set 
connection var ansible_shell_type to sh 41175 1727204672.94756: Set connection var ansible_pipelining to False 41175 1727204672.94765: Set connection var ansible_timeout to 10 41175 1727204672.94774: Set connection var ansible_connection to ssh 41175 1727204672.94786: Set connection var ansible_module_compression to ZIP_DEFLATED 41175 1727204672.94804: variable 'ansible_shell_executable' from source: unknown 41175 1727204672.94807: variable 'ansible_connection' from source: unknown 41175 1727204672.94810: variable 'ansible_module_compression' from source: unknown 41175 1727204672.94813: variable 'ansible_shell_type' from source: unknown 41175 1727204672.94820: variable 'ansible_shell_executable' from source: unknown 41175 1727204672.94823: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204672.94825: variable 'ansible_pipelining' from source: unknown 41175 1727204672.94830: variable 'ansible_timeout' from source: unknown 41175 1727204672.94835: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204672.94991: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41175 1727204672.95006: variable 'omit' from source: magic vars 41175 1727204672.95010: starting attempt loop 41175 1727204672.95012: running the handler 41175 1727204672.95030: variable 'ansible_facts' from source: unknown 41175 1727204672.95048: _low_level_execute_command(): starting 41175 1727204672.95056: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41175 1727204672.95614: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 
1727204672.95618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204672.95621: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204672.95623: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204672.95678: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204672.95682: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204672.95731: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204672.97480: stdout chunk (state=3): >>>/root <<< 41175 1727204672.97581: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204672.97641: stderr chunk (state=3): >>><<< 41175 1727204672.97645: stdout chunk (state=3): >>><<< 41175 1727204672.97668: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204672.97680: _low_level_execute_command(): starting 41175 1727204672.97687: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204672.9766743-42996-130776073882579 `" && echo ansible-tmp-1727204672.9766743-42996-130776073882579="` echo /root/.ansible/tmp/ansible-tmp-1727204672.9766743-42996-130776073882579 `" ) && sleep 0' 41175 1727204672.98160: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204672.98163: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204672.98166: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 41175 1727204672.98177: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 41175 1727204672.98180: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204672.98228: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204672.98233: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204672.98271: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204673.00257: stdout chunk (state=3): >>>ansible-tmp-1727204672.9766743-42996-130776073882579=/root/.ansible/tmp/ansible-tmp-1727204672.9766743-42996-130776073882579 <<< 41175 1727204673.00379: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204673.00427: stderr chunk (state=3): >>><<< 41175 1727204673.00431: stdout chunk (state=3): >>><<< 41175 1727204673.00447: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204672.9766743-42996-130776073882579=/root/.ansible/tmp/ansible-tmp-1727204672.9766743-42996-130776073882579 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204673.00478: variable 'ansible_module_compression' from source: unknown 41175 1727204673.00523: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-41175qa2aqh90/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 41175 1727204673.00585: variable 'ansible_facts' from source: unknown 41175 1727204673.00708: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204672.9766743-42996-130776073882579/AnsiballZ_setup.py 41175 1727204673.00833: Sending initial data 41175 1727204673.00836: Sent initial data (154 bytes) 41175 1727204673.01297: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204673.01300: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 41175 1727204673.01303: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 41175 1727204673.01305: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 41175 1727204673.01308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204673.01363: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204673.01370: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204673.01406: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204673.03020: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41175 1727204673.03056: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41175 1727204673.03088: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41175qa2aqh90/tmpjeopi001 /root/.ansible/tmp/ansible-tmp-1727204672.9766743-42996-130776073882579/AnsiballZ_setup.py <<< 41175 1727204673.03098: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204672.9766743-42996-130776073882579/AnsiballZ_setup.py" <<< 41175 1727204673.03120: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-41175qa2aqh90/tmpjeopi001" to remote "/root/.ansible/tmp/ansible-tmp-1727204672.9766743-42996-130776073882579/AnsiballZ_setup.py" <<< 41175 1727204673.03124: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204672.9766743-42996-130776073882579/AnsiballZ_setup.py" <<< 41175 1727204673.04754: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204673.04830: stderr chunk (state=3): >>><<< 41175 1727204673.04834: stdout chunk (state=3): >>><<< 41175 1727204673.04857: done transferring module to remote 41175 1727204673.04869: _low_level_execute_command(): starting 41175 1727204673.04875: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204672.9766743-42996-130776073882579/ /root/.ansible/tmp/ansible-tmp-1727204672.9766743-42996-130776073882579/AnsiballZ_setup.py && sleep 0' 41175 1727204673.05371: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204673.05374: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 
1727204673.05377: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204673.05379: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204673.05439: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204673.05446: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204673.05448: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204673.05481: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204673.13603: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204673.13662: stderr chunk (state=3): >>><<< 41175 1727204673.13666: stdout chunk (state=3): >>><<< 41175 1727204673.13683: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204673.13686: _low_level_execute_command(): starting 41175 1727204673.13694: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204672.9766743-42996-130776073882579/AnsiballZ_setup.py && sleep 0' 41175 1727204673.14182: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204673.14185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204673.14188: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 41175 1727204673.14202: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204673.14250: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204673.14253: stderr chunk (state=3): >>>debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204673.14305: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204673.82884: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "04", "second": "33", "epoch": "1727204673", "epoch_int": "1727204673", "date": "2024-09-24", "time": "15:04:33", "iso8601_micro": "2024-09-24T19:04:33.439983Z", "iso8601": "2024-09-24T19:04:33Z", "iso8601_basic": "20240924T150433439983", "iso8601_basic_short": "20240924T150433", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_apparmor": {"status": "disabled"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fibre_channel_wwn": [], "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec272ed147e29e35f2e68cd6465c5ec1", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_local": {}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", 
"ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_is_chroot": false, "ansible_iscsi_iqn": "", "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAI5YZQ7OH6eqgmanrwxkUl16pMvE2q26X32NofYRKBzF04m84VIsiCBP80rN+sGEKnRhTwlxJwcSfAyscmxkynk8ozeR0SaMEECkbOjee1DqGR1yz8VSKEIk2gZ+ImYscF6c32jGvz1w/gz9baswEs+v92Ljqv3+V3s8foVkwWM1AAAAFQDApo03iAyJzp9y7AillVl9LpN8rwAAAIBNHNvfLLH/rvWMdavYWGiljarx5Z8cDKFv4QiliuY2AenrQ5mjBN3ZJZuDpmwC9vuoPM+TWxp9pbrnVJy4VM6iS8c/Lr9I982fUD4neMvJEywdnYtsRhezGMCk57/Npw91h6EKhcAYiaFF53jl540WIjTvu2bEA8Hgb11YGH+isAAAAIAkremps+61DEFeDWQjRHbf8fZzhmpUYduU+sHRW5usa/1cOOeeN/8XBHfMST6TPedAY/6t7Oxda9D2mq6mo2Rl9arSQWcBypqwvzRiz0LGnRnElGtXKJALy6vYKG7xi+29ZmqlBvD14cB7/wSZqZP9MkRj3+QzQJLvNnuGRyLguA==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDBj8PEtqglWtlJ3r3hgP2TELjSd8JOOpjIitLlWjKdUao5ePB6PWTf9MZV0rLZr0re7hAS1EWeexARYQakyETmyOoPmRCaD5vvrfN3AJJ6I+O2EhApLpYrEORJbTfrme6AoCGmxQG8tR7j3YpVOvePZ65ka7FDUWnNLI0DWpyDURAKmvOxtiOcYazpmB7GJ/5ycpEAV7KGp7tEQ9MNIAbSaYTBXVBNa5V2HyEmcabs+/Qy/jp8OWy+Tl3uCUV0SmFplVCKib9Kp3eEMZd5udXsYnmUYtLNMJQkQOzTdol5AozustkdBnasVn/RSnQpWQMBrrUQMxchNOb8FDAuH6AONEVJl9mHY6mk3zfkkyPZE6sIrMIj0B48xTWzMIjC+N9SN7DRRUWzjYIqkL5fsYu0fkkGuZeNvyJRlv8h7oFWA7YtvNHdNYf41mkXryERg8V3zI0aZcmQul6XTOxywwd4b5sudMIng09hfyPOKtnYi6DIN2h5FxOWlvBEbLlcd2U=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPUvqdp1GSRMDwSqfOZO1hLGpDfzy51B9cIhTK2AWy7qlUXPaSlJ0jc31uj+CW3SnUW36VSKRHdj9R9hJev9Zic=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFL7RdA+aCgUcBhcJBLwti3mnwduhYXxSw8RlI3Cvebm", 
"ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:5e:c8:16:36:1d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.90", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::37d3:4e93:30d:de94", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off 
[fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", 
"tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local":<<< 41175 1727204673.82922: stdout chunk (state=3): >>> "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", 
"address": "10.31.10.90", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:5e:c8:16:36:1d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.90"], "ansible_all_ipv6_addresses": ["fe80::37d3:4e93:30d:de94"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::37d3:4e93:30d:de94"]}, "ansible_loadavg": {"1m": 1.0712890625, "5m": 0.89013671875, "15m": 0.544921875}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2835, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 882, "free": 2835}, "nocache": {"free": 3465, "used": 252}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec272ed1-47e2-9e35-f2e6-8cd6465c5ec1", "ansible_product_uuid": "ec272ed1-47e2-9e35-f2e6-8cd6465c5ec1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, 
"sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 1177, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251148578816, "block_size": 4096, "block_total": 64479564, "block_available": 61315571, "block_used": 3163993, "inode_total": 16384000, "inode_available": 16302069, "inode_used": 81931, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_hostnqn": "", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", 
"LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 50414 10.31.10.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 50414 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fips": false, "ansible_lsb": {}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 41175 1727204673.85075: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
<<< 41175 1727204673.85139: stderr chunk (state=3): >>><<< 41175 1727204673.85143: stdout chunk (state=3): >>><<< 41175 1727204673.85178: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "04", "second": "33", "epoch": "1727204673", "epoch_int": "1727204673", "date": "2024-09-24", "time": "15:04:33", "iso8601_micro": "2024-09-24T19:04:33.439983Z", "iso8601": "2024-09-24T19:04:33Z", "iso8601_basic": "20240924T150433439983", "iso8601_basic_short": "20240924T150433", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_apparmor": {"status": "disabled"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fibre_channel_wwn": [], "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec272ed147e29e35f2e68cd6465c5ec1", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_local": {}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, 
"ansible_os_family": "RedHat", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_is_chroot": false, "ansible_iscsi_iqn": "", "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAI5YZQ7OH6eqgmanrwxkUl16pMvE2q26X32NofYRKBzF04m84VIsiCBP80rN+sGEKnRhTwlxJwcSfAyscmxkynk8ozeR0SaMEECkbOjee1DqGR1yz8VSKEIk2gZ+ImYscF6c32jGvz1w/gz9baswEs+v92Ljqv3+V3s8foVkwWM1AAAAFQDApo03iAyJzp9y7AillVl9LpN8rwAAAIBNHNvfLLH/rvWMdavYWGiljarx5Z8cDKFv4QiliuY2AenrQ5mjBN3ZJZuDpmwC9vuoPM+TWxp9pbrnVJy4VM6iS8c/Lr9I982fUD4neMvJEywdnYtsRhezGMCk57/Npw91h6EKhcAYiaFF53jl540WIjTvu2bEA8Hgb11YGH+isAAAAIAkremps+61DEFeDWQjRHbf8fZzhmpUYduU+sHRW5usa/1cOOeeN/8XBHfMST6TPedAY/6t7Oxda9D2mq6mo2Rl9arSQWcBypqwvzRiz0LGnRnElGtXKJALy6vYKG7xi+29ZmqlBvD14cB7/wSZqZP9MkRj3+QzQJLvNnuGRyLguA==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDBj8PEtqglWtlJ3r3hgP2TELjSd8JOOpjIitLlWjKdUao5ePB6PWTf9MZV0rLZr0re7hAS1EWeexARYQakyETmyOoPmRCaD5vvrfN3AJJ6I+O2EhApLpYrEORJbTfrme6AoCGmxQG8tR7j3YpVOvePZ65ka7FDUWnNLI0DWpyDURAKmvOxtiOcYazpmB7GJ/5ycpEAV7KGp7tEQ9MNIAbSaYTBXVBNa5V2HyEmcabs+/Qy/jp8OWy+Tl3uCUV0SmFplVCKib9Kp3eEMZd5udXsYnmUYtLNMJQkQOzTdol5AozustkdBnasVn/RSnQpWQMBrrUQMxchNOb8FDAuH6AONEVJl9mHY6mk3zfkkyPZE6sIrMIj0B48xTWzMIjC+N9SN7DRRUWzjYIqkL5fsYu0fkkGuZeNvyJRlv8h7oFWA7YtvNHdNYf41mkXryERg8V3zI0aZcmQul6XTOxywwd4b5sudMIng09hfyPOKtnYi6DIN2h5FxOWlvBEbLlcd2U=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPUvqdp1GSRMDwSqfOZO1hLGpDfzy51B9cIhTK2AWy7qlUXPaSlJ0jc31uj+CW3SnUW36VSKRHdj9R9hJev9Zic=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFL7RdA+aCgUcBhcJBLwti3mnwduhYXxSw8RlI3Cvebm", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_cmdline": 
{"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:5e:c8:16:36:1d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.90", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::37d3:4e93:30d:de94", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off 
[fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", 
"tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.90", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", 
"macaddress": "12:5e:c8:16:36:1d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.90"], "ansible_all_ipv6_addresses": ["fe80::37d3:4e93:30d:de94"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::37d3:4e93:30d:de94"]}, "ansible_loadavg": {"1m": 1.0712890625, "5m": 0.89013671875, "15m": 0.544921875}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2835, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 882, "free": 2835}, "nocache": {"free": 3465, "used": 252}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec272ed1-47e2-9e35-f2e6-8cd6465c5ec1", "ansible_product_uuid": "ec272ed1-47e2-9e35-f2e6-8cd6465c5ec1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": 
["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 1177, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251148578816, "block_size": 4096, "block_total": 64479564, "block_available": 61315571, "block_used": 3163993, "inode_total": 16384000, "inode_available": 16302069, "inode_used": 81931, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_hostnqn": "", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", 
"LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 50414 10.31.10.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 50414 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fips": false, "ansible_lsb": {}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 41175 1727204673.85522: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204672.9766743-42996-130776073882579/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41175 1727204673.85541: _low_level_execute_command(): starting 41175 1727204673.85546: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204672.9766743-42996-130776073882579/ > /dev/null 2>&1 && sleep 0' 41175 1727204673.86032: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204673.86035: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204673.86038: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204673.86040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204673.86095: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204673.86098: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204673.86143: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204673.88042: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204673.88092: stderr chunk (state=3): >>><<< 41175 1727204673.88095: stdout chunk (state=3): >>><<< 41175 1727204673.88111: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204673.88121: handler run complete 41175 1727204673.88242: variable 'ansible_facts' from source: unknown 41175 1727204673.88342: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204673.88631: variable 'ansible_facts' from source: unknown 41175 1727204673.88706: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204673.88828: attempt loop complete, returning result 41175 1727204673.88833: _execute() done 41175 1727204673.88838: dumping result to json 41175 1727204673.88863: done dumping result, returning 41175 1727204673.88872: done running TaskExecutor() for managed-node3/TASK: Gathering Facts [12b410aa-8751-f070-39c4-00000000078e] 41175 1727204673.88880: sending task result for task 12b410aa-8751-f070-39c4-00000000078e ok: [managed-node3] 41175 1727204673.89547: no more pending results, returning what we have 41175 1727204673.89551: results queue empty 41175 1727204673.89552: checking for any_errors_fatal 41175 1727204673.89553: done checking for any_errors_fatal 41175 1727204673.89553: checking for max_fail_percentage 41175 1727204673.89555: done checking for max_fail_percentage 41175 1727204673.89556: checking to see if all hosts have failed and the running result is not ok 41175 1727204673.89556: done checking to see if all hosts have failed 41175 1727204673.89557: getting the remaining hosts for this loop 41175 1727204673.89558: done getting the remaining hosts for this loop 41175 1727204673.89561: getting the next task for host managed-node3 41175 1727204673.89568: done getting next task for host managed-node3 41175 1727204673.89570: ^ task is: 
TASK: meta (flush_handlers) 41175 1727204673.89571: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41175 1727204673.89575: getting variables 41175 1727204673.89576: in VariableManager get_vars() 41175 1727204673.89601: Calling all_inventory to load vars for managed-node3 41175 1727204673.89604: Calling groups_inventory to load vars for managed-node3 41175 1727204673.89605: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204673.89615: Calling all_plugins_play to load vars for managed-node3 41175 1727204673.89620: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204673.89623: Calling groups_plugins_play to load vars for managed-node3 41175 1727204673.90141: done sending task result for task 12b410aa-8751-f070-39c4-00000000078e 41175 1727204673.90144: WORKER PROCESS EXITING 41175 1727204673.90952: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204673.92596: done with get_vars() 41175 1727204673.92624: done getting variables 41175 1727204673.92679: in VariableManager get_vars() 41175 1727204673.92688: Calling all_inventory to load vars for managed-node3 41175 1727204673.92692: Calling groups_inventory to load vars for managed-node3 41175 1727204673.92694: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204673.92698: Calling all_plugins_play to load vars for managed-node3 41175 1727204673.92700: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204673.92702: Calling groups_plugins_play to load vars for managed-node3 41175 1727204673.93801: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 
1727204673.95513: done with get_vars() 41175 1727204673.95545: done queuing things up, now waiting for results queue to drain 41175 1727204673.95547: results queue empty 41175 1727204673.95547: checking for any_errors_fatal 41175 1727204673.95552: done checking for any_errors_fatal 41175 1727204673.95552: checking for max_fail_percentage 41175 1727204673.95553: done checking for max_fail_percentage 41175 1727204673.95554: checking to see if all hosts have failed and the running result is not ok 41175 1727204673.95560: done checking to see if all hosts have failed 41175 1727204673.95561: getting the remaining hosts for this loop 41175 1727204673.95562: done getting the remaining hosts for this loop 41175 1727204673.95564: getting the next task for host managed-node3 41175 1727204673.95568: done getting next task for host managed-node3 41175 1727204673.95570: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 41175 1727204673.95571: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204673.95579: getting variables 41175 1727204673.95580: in VariableManager get_vars() 41175 1727204673.95593: Calling all_inventory to load vars for managed-node3 41175 1727204673.95595: Calling groups_inventory to load vars for managed-node3 41175 1727204673.95597: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204673.95601: Calling all_plugins_play to load vars for managed-node3 41175 1727204673.95603: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204673.95605: Calling groups_plugins_play to load vars for managed-node3 41175 1727204673.96726: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204673.98337: done with get_vars() 41175 1727204673.98362: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 15:04:33 -0400 (0:00:01.050) 0:00:41.123 ***** 41175 1727204673.98431: entering _queue_task() for managed-node3/include_tasks 41175 1727204673.98709: worker is 1 (out of 1 available) 41175 1727204673.98726: exiting _queue_task() for managed-node3/include_tasks 41175 1727204673.98740: done queuing things up, now waiting for results queue to drain 41175 1727204673.98741: waiting for pending results... 
41175 1727204673.98939: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 41175 1727204673.99028: in run() - task 12b410aa-8751-f070-39c4-0000000000d7 41175 1727204673.99043: variable 'ansible_search_path' from source: unknown 41175 1727204673.99047: variable 'ansible_search_path' from source: unknown 41175 1727204673.99077: calling self._execute() 41175 1727204673.99171: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204673.99178: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204673.99197: variable 'omit' from source: magic vars 41175 1727204673.99517: variable 'ansible_distribution_major_version' from source: facts 41175 1727204673.99532: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204673.99539: _execute() done 41175 1727204673.99543: dumping result to json 41175 1727204673.99550: done dumping result, returning 41175 1727204673.99558: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12b410aa-8751-f070-39c4-0000000000d7] 41175 1727204673.99565: sending task result for task 12b410aa-8751-f070-39c4-0000000000d7 41175 1727204673.99661: done sending task result for task 12b410aa-8751-f070-39c4-0000000000d7 41175 1727204673.99663: WORKER PROCESS EXITING 41175 1727204673.99707: no more pending results, returning what we have 41175 1727204673.99712: in VariableManager get_vars() 41175 1727204673.99756: Calling all_inventory to load vars for managed-node3 41175 1727204673.99760: Calling groups_inventory to load vars for managed-node3 41175 1727204673.99762: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204673.99777: Calling all_plugins_play to load vars for managed-node3 41175 1727204673.99780: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204673.99784: Calling 
groups_plugins_play to load vars for managed-node3 41175 1727204674.04412: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204674.06016: done with get_vars() 41175 1727204674.06041: variable 'ansible_search_path' from source: unknown 41175 1727204674.06042: variable 'ansible_search_path' from source: unknown 41175 1727204674.06068: we have included files to process 41175 1727204674.06068: generating all_blocks data 41175 1727204674.06069: done generating all_blocks data 41175 1727204674.06070: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 41175 1727204674.06071: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 41175 1727204674.06073: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 41175 1727204674.06529: done processing included file 41175 1727204674.06531: iterating over new_blocks loaded from include file 41175 1727204674.06533: in VariableManager get_vars() 41175 1727204674.06551: done with get_vars() 41175 1727204674.06553: filtering new block on tags 41175 1727204674.06565: done filtering new block on tags 41175 1727204674.06567: in VariableManager get_vars() 41175 1727204674.06581: done with get_vars() 41175 1727204674.06582: filtering new block on tags 41175 1727204674.06598: done filtering new block on tags 41175 1727204674.06600: in VariableManager get_vars() 41175 1727204674.06614: done with get_vars() 41175 1727204674.06615: filtering new block on tags 41175 1727204674.06628: done filtering new block on tags 41175 1727204674.06629: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node3 41175 1727204674.06633: extending task lists for 
all hosts with included blocks 41175 1727204674.06923: done extending task lists 41175 1727204674.06925: done processing included files 41175 1727204674.06925: results queue empty 41175 1727204674.06926: checking for any_errors_fatal 41175 1727204674.06927: done checking for any_errors_fatal 41175 1727204674.06928: checking for max_fail_percentage 41175 1727204674.06928: done checking for max_fail_percentage 41175 1727204674.06929: checking to see if all hosts have failed and the running result is not ok 41175 1727204674.06930: done checking to see if all hosts have failed 41175 1727204674.06930: getting the remaining hosts for this loop 41175 1727204674.06931: done getting the remaining hosts for this loop 41175 1727204674.06933: getting the next task for host managed-node3 41175 1727204674.06936: done getting next task for host managed-node3 41175 1727204674.06938: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 41175 1727204674.06939: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204674.06946: getting variables 41175 1727204674.06947: in VariableManager get_vars() 41175 1727204674.06958: Calling all_inventory to load vars for managed-node3 41175 1727204674.06960: Calling groups_inventory to load vars for managed-node3 41175 1727204674.06962: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204674.06967: Calling all_plugins_play to load vars for managed-node3 41175 1727204674.06969: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204674.06971: Calling groups_plugins_play to load vars for managed-node3 41175 1727204674.08106: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204674.09774: done with get_vars() 41175 1727204674.09801: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 15:04:34 -0400 (0:00:00.114) 0:00:41.237 ***** 41175 1727204674.09855: entering _queue_task() for managed-node3/setup 41175 1727204674.10139: worker is 1 (out of 1 available) 41175 1727204674.10154: exiting _queue_task() for managed-node3/setup 41175 1727204674.10167: done queuing things up, now waiting for results queue to drain 41175 1727204674.10169: waiting for pending results... 
41175 1727204674.10372: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 41175 1727204674.10474: in run() - task 12b410aa-8751-f070-39c4-0000000007cf 41175 1727204674.10488: variable 'ansible_search_path' from source: unknown 41175 1727204674.10493: variable 'ansible_search_path' from source: unknown 41175 1727204674.10531: calling self._execute() 41175 1727204674.10618: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204674.10628: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204674.10638: variable 'omit' from source: magic vars 41175 1727204674.10969: variable 'ansible_distribution_major_version' from source: facts 41175 1727204674.10980: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204674.11175: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41175 1727204674.12920: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41175 1727204674.12980: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41175 1727204674.13014: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41175 1727204674.13049: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41175 1727204674.13072: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41175 1727204674.13147: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204674.13172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204674.13194: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204674.13229: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204674.13246: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204674.13293: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204674.13313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204674.13337: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204674.13372: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204674.13385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204674.13521: variable '__network_required_facts' from source: role 
'' defaults 41175 1727204674.13532: variable 'ansible_facts' from source: unknown 41175 1727204674.14248: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 41175 1727204674.14253: when evaluation is False, skipping this task 41175 1727204674.14255: _execute() done 41175 1727204674.14258: dumping result to json 41175 1727204674.14263: done dumping result, returning 41175 1727204674.14271: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12b410aa-8751-f070-39c4-0000000007cf] 41175 1727204674.14277: sending task result for task 12b410aa-8751-f070-39c4-0000000007cf 41175 1727204674.14368: done sending task result for task 12b410aa-8751-f070-39c4-0000000007cf 41175 1727204674.14371: WORKER PROCESS EXITING skipping: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 41175 1727204674.14422: no more pending results, returning what we have 41175 1727204674.14426: results queue empty 41175 1727204674.14427: checking for any_errors_fatal 41175 1727204674.14429: done checking for any_errors_fatal 41175 1727204674.14430: checking for max_fail_percentage 41175 1727204674.14432: done checking for max_fail_percentage 41175 1727204674.14433: checking to see if all hosts have failed and the running result is not ok 41175 1727204674.14434: done checking to see if all hosts have failed 41175 1727204674.14434: getting the remaining hosts for this loop 41175 1727204674.14436: done getting the remaining hosts for this loop 41175 1727204674.14441: getting the next task for host managed-node3 41175 1727204674.14450: done getting next task for host managed-node3 41175 1727204674.14453: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 41175 1727204674.14456: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41175 1727204674.14472: getting variables 41175 1727204674.14473: in VariableManager get_vars() 41175 1727204674.14524: Calling all_inventory to load vars for managed-node3 41175 1727204674.14527: Calling groups_inventory to load vars for managed-node3 41175 1727204674.14530: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204674.14541: Calling all_plugins_play to load vars for managed-node3 41175 1727204674.14544: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204674.14547: Calling groups_plugins_play to load vars for managed-node3 41175 1727204674.15828: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204674.17465: done with get_vars() 41175 1727204674.17490: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 15:04:34 -0400 (0:00:00.077) 0:00:41.314 ***** 41175 1727204674.17572: entering _queue_task() for managed-node3/stat 41175 1727204674.17823: worker is 1 (out of 1 available) 41175 1727204674.17837: exiting _queue_task() for managed-node3/stat 41175 1727204674.17850: done queuing things up, now waiting for results queue to drain 41175 1727204674.17852: waiting for pending results... 
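The skipped task above was gated on `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0`: gather facts only when some required fact is still missing. A small Python sketch of the same check, with hypothetical fact names (Jinja's `difference` filter corresponds to set subtraction here):

```python
# Sketch of the conditional evaluated above: run the setup task only when
# at least one required fact is absent from the cached ansible_facts.
# Fact names and values are hypothetical examples.
required_facts = ["distribution", "os_family"]
ansible_facts = {"distribution": "Fedora", "os_family": "RedHat", "kernel": "6.5"}

missing = set(required_facts) - set(ansible_facts)  # like the difference filter
should_gather = len(missing) > 0
# all required facts are present, so should_gather is False and the task
# is skipped, matching the "Evaluated conditional ... False" line in the log
```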
41175 1727204674.18057: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 41175 1727204674.18173: in run() - task 12b410aa-8751-f070-39c4-0000000007d1 41175 1727204674.18191: variable 'ansible_search_path' from source: unknown 41175 1727204674.18197: variable 'ansible_search_path' from source: unknown 41175 1727204674.18228: calling self._execute() 41175 1727204674.18314: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204674.18322: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204674.18334: variable 'omit' from source: magic vars 41175 1727204674.18649: variable 'ansible_distribution_major_version' from source: facts 41175 1727204674.18660: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204674.18808: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41175 1727204674.19037: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41175 1727204674.19077: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41175 1727204674.19109: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41175 1727204674.19140: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41175 1727204674.19251: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41175 1727204674.19275: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41175 1727204674.19397: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204674.19402: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41175 1727204674.19405: variable '__network_is_ostree' from source: set_fact 41175 1727204674.19408: Evaluated conditional (not __network_is_ostree is defined): False 41175 1727204674.19410: when evaluation is False, skipping this task 41175 1727204674.19413: _execute() done 41175 1727204674.19415: dumping result to json 41175 1727204674.19420: done dumping result, returning 41175 1727204674.19424: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [12b410aa-8751-f070-39c4-0000000007d1] 41175 1727204674.19430: sending task result for task 12b410aa-8751-f070-39c4-0000000007d1 41175 1727204674.19519: done sending task result for task 12b410aa-8751-f070-39c4-0000000007d1 41175 1727204674.19523: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 41175 1727204674.19581: no more pending results, returning what we have 41175 1727204674.19586: results queue empty 41175 1727204674.19587: checking for any_errors_fatal 41175 1727204674.19597: done checking for any_errors_fatal 41175 1727204674.19598: checking for max_fail_percentage 41175 1727204674.19600: done checking for max_fail_percentage 41175 1727204674.19601: checking to see if all hosts have failed and the running result is not ok 41175 1727204674.19602: done checking to see if all hosts have failed 41175 1727204674.19603: getting the remaining hosts for this loop 41175 1727204674.19605: done getting the remaining hosts for this loop 41175 
1727204674.19609: getting the next task for host managed-node3 41175 1727204674.19616: done getting next task for host managed-node3 41175 1727204674.19620: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 41175 1727204674.19623: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41175 1727204674.19639: getting variables 41175 1727204674.19640: in VariableManager get_vars() 41175 1727204674.19677: Calling all_inventory to load vars for managed-node3 41175 1727204674.19680: Calling groups_inventory to load vars for managed-node3 41175 1727204674.19682: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204674.19697: Calling all_plugins_play to load vars for managed-node3 41175 1727204674.19701: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204674.19705: Calling groups_plugins_play to load vars for managed-node3 41175 1727204674.21085: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204674.22690: done with get_vars() 41175 1727204674.22714: done getting variables 41175 1727204674.22765: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK 
[fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 15:04:34 -0400 (0:00:00.052) 0:00:41.366 ***** 41175 1727204674.22795: entering _queue_task() for managed-node3/set_fact 41175 1727204674.23050: worker is 1 (out of 1 available) 41175 1727204674.23066: exiting _queue_task() for managed-node3/set_fact 41175 1727204674.23078: done queuing things up, now waiting for results queue to drain 41175 1727204674.23080: waiting for pending results... 41175 1727204674.23283: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 41175 1727204674.23387: in run() - task 12b410aa-8751-f070-39c4-0000000007d2 41175 1727204674.23402: variable 'ansible_search_path' from source: unknown 41175 1727204674.23406: variable 'ansible_search_path' from source: unknown 41175 1727204674.23444: calling self._execute() 41175 1727204674.23533: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204674.23544: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204674.23551: variable 'omit' from source: magic vars 41175 1727204674.23871: variable 'ansible_distribution_major_version' from source: facts 41175 1727204674.23881: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204674.24030: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41175 1727204674.24256: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41175 1727204674.24298: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41175 1727204674.24330: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41175 
1727204674.24361: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41175 1727204674.24467: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41175 1727204674.24491: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41175 1727204674.24517: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204674.24542: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41175 1727204674.24627: variable '__network_is_ostree' from source: set_fact 41175 1727204674.24631: Evaluated conditional (not __network_is_ostree is defined): False 41175 1727204674.24636: when evaluation is False, skipping this task 41175 1727204674.24638: _execute() done 41175 1727204674.24641: dumping result to json 41175 1727204674.24643: done dumping result, returning 41175 1727204674.24647: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12b410aa-8751-f070-39c4-0000000007d2] 41175 1727204674.24653: sending task result for task 12b410aa-8751-f070-39c4-0000000007d2 41175 1727204674.24744: done sending task result for task 12b410aa-8751-f070-39c4-0000000007d2 41175 1727204674.24746: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 41175 1727204674.24800: no more pending results, returning what we 
have 41175 1727204674.24805: results queue empty 41175 1727204674.24806: checking for any_errors_fatal 41175 1727204674.24814: done checking for any_errors_fatal 41175 1727204674.24815: checking for max_fail_percentage 41175 1727204674.24817: done checking for max_fail_percentage 41175 1727204674.24818: checking to see if all hosts have failed and the running result is not ok 41175 1727204674.24819: done checking to see if all hosts have failed 41175 1727204674.24820: getting the remaining hosts for this loop 41175 1727204674.24822: done getting the remaining hosts for this loop 41175 1727204674.24826: getting the next task for host managed-node3 41175 1727204674.24835: done getting next task for host managed-node3 41175 1727204674.24839: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 41175 1727204674.24842: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204674.24856: getting variables 41175 1727204674.24857: in VariableManager get_vars() 41175 1727204674.24896: Calling all_inventory to load vars for managed-node3 41175 1727204674.24900: Calling groups_inventory to load vars for managed-node3 41175 1727204674.24902: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204674.24912: Calling all_plugins_play to load vars for managed-node3 41175 1727204674.24915: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204674.24919: Calling groups_plugins_play to load vars for managed-node3 41175 1727204674.26144: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204674.27853: done with get_vars() 41175 1727204674.27875: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 15:04:34 -0400 (0:00:00.051) 0:00:41.418 ***** 41175 1727204674.27956: entering _queue_task() for managed-node3/service_facts 41175 1727204674.28191: worker is 1 (out of 1 available) 41175 1727204674.28207: exiting _queue_task() for managed-node3/service_facts 41175 1727204674.28218: done queuing things up, now waiting for results queue to drain 41175 1727204674.28220: waiting for pending results... 
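Both ostree tasks above were skipped because `not __network_is_ostree is defined` evaluated to False: once a `set_fact` has cached the answer, the stat check and the flag assignment are bypassed on later role invocations. A sketch of that run-once guard in Python, modeling "defined" as dictionary membership:

```python
# Sketch of the "skip when already known" guard from the two skipped tasks
# above: the ostree stat check and set_fact only run while the cached
# __network_is_ostree fact does not yet exist. Values are hypothetical.
def needs_ostree_check(facts):
    """True only while __network_is_ostree has not been set."""
    return "__network_is_ostree" not in facts

facts = {}
first_run = needs_ostree_check(facts)    # True: perform the stat check once
facts["__network_is_ostree"] = False     # result cached via set_fact
later_run = needs_ostree_check(facts)    # False: skip, as in the log
```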
41175 1727204674.28420: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which services are running 41175 1727204674.28515: in run() - task 12b410aa-8751-f070-39c4-0000000007d4 41175 1727204674.28532: variable 'ansible_search_path' from source: unknown 41175 1727204674.28536: variable 'ansible_search_path' from source: unknown 41175 1727204674.28570: calling self._execute() 41175 1727204674.28661: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204674.28665: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204674.28679: variable 'omit' from source: magic vars 41175 1727204674.29004: variable 'ansible_distribution_major_version' from source: facts 41175 1727204674.29016: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204674.29026: variable 'omit' from source: magic vars 41175 1727204674.29071: variable 'omit' from source: magic vars 41175 1727204674.29106: variable 'omit' from source: magic vars 41175 1727204674.29144: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41175 1727204674.29174: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41175 1727204674.29193: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41175 1727204674.29213: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204674.29230: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204674.29256: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41175 1727204674.29259: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204674.29264: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed-node3' 41175 1727204674.29355: Set connection var ansible_shell_executable to /bin/sh 41175 1727204674.29359: Set connection var ansible_shell_type to sh 41175 1727204674.29365: Set connection var ansible_pipelining to False 41175 1727204674.29374: Set connection var ansible_timeout to 10 41175 1727204674.29380: Set connection var ansible_connection to ssh 41175 1727204674.29386: Set connection var ansible_module_compression to ZIP_DEFLATED 41175 1727204674.29407: variable 'ansible_shell_executable' from source: unknown 41175 1727204674.29410: variable 'ansible_connection' from source: unknown 41175 1727204674.29415: variable 'ansible_module_compression' from source: unknown 41175 1727204674.29417: variable 'ansible_shell_type' from source: unknown 41175 1727204674.29425: variable 'ansible_shell_executable' from source: unknown 41175 1727204674.29428: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204674.29433: variable 'ansible_pipelining' from source: unknown 41175 1727204674.29435: variable 'ansible_timeout' from source: unknown 41175 1727204674.29445: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204674.29614: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 41175 1727204674.29627: variable 'omit' from source: magic vars 41175 1727204674.29632: starting attempt loop 41175 1727204674.29637: running the handler 41175 1727204674.29652: _low_level_execute_command(): starting 41175 1727204674.29664: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41175 1727204674.30220: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 41175 1727204674.30224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204674.30228: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 41175 1727204674.30230: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204674.30284: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204674.30287: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204674.30293: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204674.30334: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204674.32086: stdout chunk (state=3): >>>/root <<< 41175 1727204674.32200: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204674.32254: stderr chunk (state=3): >>><<< 41175 1727204674.32257: stdout chunk (state=3): >>><<< 41175 1727204674.32280: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204674.32291: _low_level_execute_command(): starting 41175 1727204674.32299: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204674.322769-43015-167177843318703 `" && echo ansible-tmp-1727204674.322769-43015-167177843318703="` echo /root/.ansible/tmp/ansible-tmp-1727204674.322769-43015-167177843318703 `" ) && sleep 0' 41175 1727204674.32763: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204674.32766: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204674.32769: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204674.32778: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204674.32830: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204674.32838: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204674.32874: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204674.34854: stdout chunk (state=3): >>>ansible-tmp-1727204674.322769-43015-167177843318703=/root/.ansible/tmp/ansible-tmp-1727204674.322769-43015-167177843318703 <<< 41175 1727204674.34971: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204674.35027: stderr chunk (state=3): >>><<< 41175 1727204674.35030: stdout chunk (state=3): >>><<< 41175 1727204674.35046: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204674.322769-43015-167177843318703=/root/.ansible/tmp/ansible-tmp-1727204674.322769-43015-167177843318703 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204674.35085: variable 'ansible_module_compression' from source: unknown 41175 1727204674.35131: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-41175qa2aqh90/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 41175 1727204674.35169: variable 'ansible_facts' from source: unknown 41175 1727204674.35239: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204674.322769-43015-167177843318703/AnsiballZ_service_facts.py 41175 1727204674.35353: Sending initial data 41175 1727204674.35357: Sent initial data (161 bytes) 41175 1727204674.35829: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204674.35832: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 41175 1727204674.35835: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 41175 1727204674.35838: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204674.35840: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204674.35896: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204674.35903: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204674.35939: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204674.37545: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 41175 1727204674.37550: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41175 1727204674.37578: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41175 1727204674.37617: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41175qa2aqh90/tmp9n0cbr94 /root/.ansible/tmp/ansible-tmp-1727204674.322769-43015-167177843318703/AnsiballZ_service_facts.py <<< 41175 1727204674.37620: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204674.322769-43015-167177843318703/AnsiballZ_service_facts.py" <<< 41175 1727204674.37647: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-41175qa2aqh90/tmp9n0cbr94" to remote "/root/.ansible/tmp/ansible-tmp-1727204674.322769-43015-167177843318703/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204674.322769-43015-167177843318703/AnsiballZ_service_facts.py" <<< 41175 1727204674.38429: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204674.38501: stderr chunk (state=3): >>><<< 41175 1727204674.38505: stdout chunk (state=3): >>><<< 41175 1727204674.38526: done transferring module to remote 41175 1727204674.38536: _low_level_execute_command(): starting 41175 1727204674.38542: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204674.322769-43015-167177843318703/ /root/.ansible/tmp/ansible-tmp-1727204674.322769-43015-167177843318703/AnsiballZ_service_facts.py && sleep 0' 41175 1727204674.39013: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204674.39019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204674.39022: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 41175 1727204674.39025: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 41175 1727204674.39027: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204674.39074: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204674.39084: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204674.39117: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204674.40959: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204674.41015: stderr chunk (state=3): >>><<< 41175 1727204674.41021: stdout chunk (state=3): >>><<< 41175 1727204674.41033: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204674.41036: _low_level_execute_command(): starting 41175 1727204674.41042: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204674.322769-43015-167177843318703/AnsiballZ_service_facts.py && sleep 0' 41175 1727204674.41514: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204674.41517: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204674.41522: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 41175 1727204674.41525: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 41175 1727204674.41527: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204674.41578: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204674.41582: stderr chunk 
(state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204674.41658: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204676.39821: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"network": {"name": "network", "state": "running", "status": "enabled", "source": "sysv"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": 
"dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "import-state.service": {"name": "import-state.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": 
"initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "ip6tables.service": {"name": "ip6tables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "iptables.service": {"name": "iptables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "generated", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": 
"NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state":<<< 41175 1727204676.39875: stdout chunk (state=3): >>> "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": 
"systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": 
"unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, 
"dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status<<< 41175 1727204676.39908: stdout chunk (state=3): >>>": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "loadmodules.service": {"name": "loadmodules.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", 
"source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 41175 1727204676.41586: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
<<< 41175 1727204676.41592: stdout chunk (state=3): >>><<< 41175 1727204676.41595: stderr chunk (state=3): >>><<< 41175 1727204676.41800: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"network": {"name": "network", "state": "running", "status": "enabled", "source": "sysv"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "import-state.service": {"name": "import-state.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"ip6tables.service": {"name": "ip6tables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "iptables.service": {"name": "iptables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "generated", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": 
"sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": 
"systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": 
"systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": 
"dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "loadmodules.service": {"name": "loadmodules.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", 
"state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", 
"state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 41175 1727204676.42973: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204674.322769-43015-167177843318703/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41175 1727204676.42995: _low_level_execute_command(): starting 41175 1727204676.43007: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204674.322769-43015-167177843318703/ > /dev/null 2>&1 && sleep 0' 41175 1727204676.43803: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204676.43903: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 41175 1727204676.43932: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204676.43952: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204676.44030: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204676.46031: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204676.46035: stdout chunk (state=3): >>><<< 41175 1727204676.46037: stderr chunk (state=3): >>><<< 41175 1727204676.46196: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 
1727204676.46199: handler run complete 41175 1727204676.46403: variable 'ansible_facts' from source: unknown 41175 1727204676.46673: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204676.47581: variable 'ansible_facts' from source: unknown 41175 1727204676.47825: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204676.48245: attempt loop complete, returning result 41175 1727204676.48259: _execute() done 41175 1727204676.48269: dumping result to json 41175 1727204676.48370: done dumping result, returning 41175 1727204676.48398: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which services are running [12b410aa-8751-f070-39c4-0000000007d4] 41175 1727204676.48414: sending task result for task 12b410aa-8751-f070-39c4-0000000007d4 41175 1727204676.50045: done sending task result for task 12b410aa-8751-f070-39c4-0000000007d4 41175 1727204676.50051: WORKER PROCESS EXITING ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 41175 1727204676.50213: no more pending results, returning what we have 41175 1727204676.50220: results queue empty 41175 1727204676.50221: checking for any_errors_fatal 41175 1727204676.50226: done checking for any_errors_fatal 41175 1727204676.50227: checking for max_fail_percentage 41175 1727204676.50231: done checking for max_fail_percentage 41175 1727204676.50232: checking to see if all hosts have failed and the running result is not ok 41175 1727204676.50233: done checking to see if all hosts have failed 41175 1727204676.50234: getting the remaining hosts for this loop 41175 1727204676.50235: done getting the remaining hosts for this loop 41175 1727204676.50239: getting the next task for host managed-node3 41175 1727204676.50246: done getting next task for host 
managed-node3 41175 1727204676.50250: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 41175 1727204676.50253: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41175 1727204676.50264: getting variables 41175 1727204676.50265: in VariableManager get_vars() 41175 1727204676.50311: Calling all_inventory to load vars for managed-node3 41175 1727204676.50315: Calling groups_inventory to load vars for managed-node3 41175 1727204676.50321: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204676.50333: Calling all_plugins_play to load vars for managed-node3 41175 1727204676.50337: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204676.50341: Calling groups_plugins_play to load vars for managed-node3 41175 1727204676.52872: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204676.55995: done with get_vars() 41175 1727204676.56035: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 15:04:36 -0400 (0:00:02.281) 0:00:43.700 ***** 41175 1727204676.56120: entering _queue_task() for managed-node3/package_facts 41175 1727204676.56395: worker is 1 (out of 1 available) 41175 1727204676.56411: exiting _queue_task() for 
managed-node3/package_facts 41175 1727204676.56427: done queuing things up, now waiting for results queue to drain 41175 1727204676.56429: waiting for pending results... 41175 1727204676.56632: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 41175 1727204676.56730: in run() - task 12b410aa-8751-f070-39c4-0000000007d5 41175 1727204676.56744: variable 'ansible_search_path' from source: unknown 41175 1727204676.56749: variable 'ansible_search_path' from source: unknown 41175 1727204676.56784: calling self._execute() 41175 1727204676.56872: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204676.56876: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204676.56892: variable 'omit' from source: magic vars 41175 1727204676.57221: variable 'ansible_distribution_major_version' from source: facts 41175 1727204676.57230: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204676.57237: variable 'omit' from source: magic vars 41175 1727204676.57283: variable 'omit' from source: magic vars 41175 1727204676.57314: variable 'omit' from source: magic vars 41175 1727204676.57352: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41175 1727204676.57382: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41175 1727204676.57402: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41175 1727204676.57420: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204676.57434: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204676.57476: variable 'inventory_hostname' from source: host vars for 
'managed-node3' 41175 1727204676.57481: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204676.57484: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204676.57658: Set connection var ansible_shell_executable to /bin/sh 41175 1727204676.57662: Set connection var ansible_shell_type to sh 41175 1727204676.57667: Set connection var ansible_pipelining to False 41175 1727204676.57670: Set connection var ansible_timeout to 10 41175 1727204676.57673: Set connection var ansible_connection to ssh 41175 1727204676.57676: Set connection var ansible_module_compression to ZIP_DEFLATED 41175 1727204676.57679: variable 'ansible_shell_executable' from source: unknown 41175 1727204676.57682: variable 'ansible_connection' from source: unknown 41175 1727204676.57684: variable 'ansible_module_compression' from source: unknown 41175 1727204676.57687: variable 'ansible_shell_type' from source: unknown 41175 1727204676.57691: variable 'ansible_shell_executable' from source: unknown 41175 1727204676.57693: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204676.57756: variable 'ansible_pipelining' from source: unknown 41175 1727204676.57759: variable 'ansible_timeout' from source: unknown 41175 1727204676.57762: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204676.58113: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 41175 1727204676.58117: variable 'omit' from source: magic vars 41175 1727204676.58120: starting attempt loop 41175 1727204676.58122: running the handler 41175 1727204676.58125: _low_level_execute_command(): starting 41175 1727204676.58127: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41175 
1727204676.58817: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 41175 1727204676.58832: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204676.58875: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204676.58883: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204676.58925: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204676.60666: stdout chunk (state=3): >>>/root <<< 41175 1727204676.60806: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204676.60823: stderr chunk (state=3): >>><<< 41175 1727204676.60827: stdout chunk (state=3): >>><<< 41175 1727204676.60846: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 
originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204676.60858: _low_level_execute_command(): starting 41175 1727204676.60865: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204676.6084619-43059-122654114765494 `" && echo ansible-tmp-1727204676.6084619-43059-122654114765494="` echo /root/.ansible/tmp/ansible-tmp-1727204676.6084619-43059-122654114765494 `" ) && sleep 0' 41175 1727204676.61533: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204676.61555: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204676.61558: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204676.61601: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204676.61640: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204676.63660: stdout chunk (state=3): >>>ansible-tmp-1727204676.6084619-43059-122654114765494=/root/.ansible/tmp/ansible-tmp-1727204676.6084619-43059-122654114765494 <<< 41175 1727204676.63883: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204676.63887: stdout chunk (state=3): >>><<< 41175 1727204676.63891: stderr chunk (state=3): >>><<< 41175 1727204676.64096: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204676.6084619-43059-122654114765494=/root/.ansible/tmp/ansible-tmp-1727204676.6084619-43059-122654114765494 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204676.64100: variable 'ansible_module_compression' from source: unknown 41175 1727204676.64103: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-41175qa2aqh90/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 41175 1727204676.64134: variable 'ansible_facts' from source: unknown 41175 1727204676.64347: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204676.6084619-43059-122654114765494/AnsiballZ_package_facts.py 41175 1727204676.64569: Sending initial data 41175 1727204676.64582: Sent initial data (162 bytes) 41175 1727204676.65284: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 1727204676.65323: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204676.65353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 41175 1727204676.65408: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204676.65486: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204676.65510: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204676.65551: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204676.65624: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204676.67294: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41175 1727204676.67339: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41175 1727204676.67386: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41175qa2aqh90/tmps7jzepf3 /root/.ansible/tmp/ansible-tmp-1727204676.6084619-43059-122654114765494/AnsiballZ_package_facts.py <<< 41175 1727204676.67438: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204676.6084619-43059-122654114765494/AnsiballZ_package_facts.py" <<< 41175 1727204676.67457: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-41175qa2aqh90/tmps7jzepf3" to remote "/root/.ansible/tmp/ansible-tmp-1727204676.6084619-43059-122654114765494/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204676.6084619-43059-122654114765494/AnsiballZ_package_facts.py" <<< 41175 1727204676.69927: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204676.69967: stderr chunk (state=3): >>><<< 41175 1727204676.69981: stdout chunk (state=3): >>><<< 41175 1727204676.70030: done transferring module to remote 41175 1727204676.70053: _low_level_execute_command(): starting 41175 1727204676.70064: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204676.6084619-43059-122654114765494/ /root/.ansible/tmp/ansible-tmp-1727204676.6084619-43059-122654114765494/AnsiballZ_package_facts.py && sleep 0' 41175 1727204676.70816: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 1727204676.70835: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204676.70905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204676.70996: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204676.71060: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204676.71107: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204676.73111: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204676.73122: stdout chunk (state=3): >>><<< 41175 1727204676.73124: stderr chunk (state=3): >>><<< 41175 1727204676.73239: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204676.73243: _low_level_execute_command(): starting 41175 1727204676.73247: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204676.6084619-43059-122654114765494/AnsiballZ_package_facts.py && sleep 0' 41175 1727204676.73833: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 1727204676.73853: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204676.73868: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204676.73893: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204676.73958: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204676.74030: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204676.74067: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204676.74086: stderr chunk 
(state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204676.74169: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204677.38107: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", 
"version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", 
"release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "a<<< 41175 1727204677.38140: stdout chunk (state=3): >>>rch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "r<<< 41175 1727204677.45185: stdout chunk (state=3): >>>pm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", 
"version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": <<< 41175 1727204677.45192: stdout chunk (state=3): >>>"rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": 
"0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "relea<<< 41175 1727204677.45198: stdout chunk (state=3): >>>se": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": 
"3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-<<< 41175 1727204677.45201: stdout chunk (state=3): >>>libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": 
"2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", 
"version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release"<<< 41175 1727204677.45210: stdout chunk (state=3): >>>: "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": 
"1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils",<<< 41175 1727204677.45214: stdout chunk (state=3): >>> "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", 
"version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": 
"1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb",<<< 41175 1727204677.45218: stdout chunk (state=3): >>> "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": 
"shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": 
"3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.<<< 41175 1727204677.45221: stdout chunk (state=3): >>>fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": 
"initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": n<<< 41175 1727204677.45226: stdout chunk (state=3): >>>ull, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": 
[{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": 
[{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": 
[{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": 
[{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": <<< 41175 1727204677.45318: stdout chunk (state=3): >>>"perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": 
"502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", <<< 41175 1727204677.45326: stdout chunk (state=3): >>>"source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, 
"arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", 
"release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": 
"noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": 
"8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": 
"4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": 
"python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": 
"6.10", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chkconfig": [{"name": "chkconfig", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts": [{"name": "initscripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "network-scripts": [{"name": "network-scripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 41175 1727204677.45339: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 41175 1727204677.45342: stdout chunk (state=3): >>><<< 41175 1727204677.45344: stderr chunk (state=3): >>><<< 41175 1727204677.45366: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": 
"default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": 
"1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": 
[{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", 
"release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", 
"version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 
2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", 
"version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": 
"1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": 
"plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": 
[{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", 
"version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", 
"version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", 
"release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": 
"100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", 
"version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": 
"5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": 
"openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": 
"perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], 
"perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", 
"release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", 
"version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": 
"2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": 
"libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", 
"release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": 
"1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": 
"python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": 
"23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chkconfig": [{"name": "chkconfig", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts": [{"name": "initscripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "network-scripts": [{"name": "network-scripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 41175 1727204677.49648: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204676.6084619-43059-122654114765494/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41175 1727204677.49653: _low_level_execute_command(): starting 41175 1727204677.49655: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204676.6084619-43059-122654114765494/ > /dev/null 2>&1 && sleep 0' 41175 1727204677.50396: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 1727204677.50400: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204677.50420: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204677.50478: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204677.50500: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204677.50528: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204677.50604: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204677.52657: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204677.52661: stdout chunk (state=3): >>><<< 41175 1727204677.52667: stderr chunk (state=3): >>><<< 41175 1727204677.52685: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204677.52693: handler run complete 41175 1727204677.54472: variable 'ansible_facts' from source: unknown 41175 1727204677.55298: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204677.59308: variable 'ansible_facts' from source: unknown 41175 1727204677.60267: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204677.61905: attempt loop complete, returning result 41175 1727204677.61930: _execute() done 41175 1727204677.61933: dumping result to json 41175 1727204677.62300: done dumping result, returning 41175 1727204677.62315: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [12b410aa-8751-f070-39c4-0000000007d5] 41175 1727204677.62322: sending task result for task 12b410aa-8751-f070-39c4-0000000007d5 41175 1727204677.66362: done sending task result for task 12b410aa-8751-f070-39c4-0000000007d5 41175 1727204677.66366: WORKER PROCESS EXITING ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 41175 1727204677.66535: no more pending results, returning what we have 41175 1727204677.66539: results queue empty 41175 1727204677.66540: checking for any_errors_fatal 41175 1727204677.66547: done checking for any_errors_fatal 41175 1727204677.66548: checking for max_fail_percentage 41175 1727204677.66550: done checking for max_fail_percentage 41175 1727204677.66551: checking to see if all hosts have failed and the running result is not ok 41175 1727204677.66552: done checking to see if all hosts have failed 41175 1727204677.66553: getting the 
remaining hosts for this loop 41175 1727204677.66554: done getting the remaining hosts for this loop 41175 1727204677.66558: getting the next task for host managed-node3 41175 1727204677.66566: done getting next task for host managed-node3 41175 1727204677.66569: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 41175 1727204677.66572: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41175 1727204677.66583: getting variables 41175 1727204677.66584: in VariableManager get_vars() 41175 1727204677.66624: Calling all_inventory to load vars for managed-node3 41175 1727204677.66627: Calling groups_inventory to load vars for managed-node3 41175 1727204677.66630: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204677.66641: Calling all_plugins_play to load vars for managed-node3 41175 1727204677.66644: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204677.66648: Calling groups_plugins_play to load vars for managed-node3 41175 1727204677.68609: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204677.71671: done with get_vars() 41175 1727204677.71723: done getting variables 41175 1727204677.71804: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 
September 2024 15:04:37 -0400 (0:00:01.157) 0:00:44.857 ***** 41175 1727204677.71841: entering _queue_task() for managed-node3/debug 41175 1727204677.72225: worker is 1 (out of 1 available) 41175 1727204677.72239: exiting _queue_task() for managed-node3/debug 41175 1727204677.72253: done queuing things up, now waiting for results queue to drain 41175 1727204677.72254: waiting for pending results... 41175 1727204677.72709: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider 41175 1727204677.72714: in run() - task 12b410aa-8751-f070-39c4-0000000000d8 41175 1727204677.72729: variable 'ansible_search_path' from source: unknown 41175 1727204677.72737: variable 'ansible_search_path' from source: unknown 41175 1727204677.72783: calling self._execute() 41175 1727204677.72904: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204677.72918: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204677.72941: variable 'omit' from source: magic vars 41175 1727204677.73394: variable 'ansible_distribution_major_version' from source: facts 41175 1727204677.73413: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204677.73426: variable 'omit' from source: magic vars 41175 1727204677.73478: variable 'omit' from source: magic vars 41175 1727204677.73614: variable 'network_provider' from source: set_fact 41175 1727204677.73642: variable 'omit' from source: magic vars 41175 1727204677.73707: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41175 1727204677.73743: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41175 1727204677.73816: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41175 1727204677.73819: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 41175 1727204677.73822: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204677.73854: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41175 1727204677.73863: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204677.73872: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204677.74010: Set connection var ansible_shell_executable to /bin/sh 41175 1727204677.74019: Set connection var ansible_shell_type to sh 41175 1727204677.74036: Set connection var ansible_pipelining to False 41175 1727204677.74094: Set connection var ansible_timeout to 10 41175 1727204677.74099: Set connection var ansible_connection to ssh 41175 1727204677.74101: Set connection var ansible_module_compression to ZIP_DEFLATED 41175 1727204677.74103: variable 'ansible_shell_executable' from source: unknown 41175 1727204677.74109: variable 'ansible_connection' from source: unknown 41175 1727204677.74116: variable 'ansible_module_compression' from source: unknown 41175 1727204677.74124: variable 'ansible_shell_type' from source: unknown 41175 1727204677.74131: variable 'ansible_shell_executable' from source: unknown 41175 1727204677.74143: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204677.74152: variable 'ansible_pipelining' from source: unknown 41175 1727204677.74159: variable 'ansible_timeout' from source: unknown 41175 1727204677.74167: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204677.74359: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41175 
1727204677.74363: variable 'omit' from source: magic vars 41175 1727204677.74368: starting attempt loop 41175 1727204677.74376: running the handler 41175 1727204677.74433: handler run complete 41175 1727204677.74466: attempt loop complete, returning result 41175 1727204677.74469: _execute() done 41175 1727204677.74494: dumping result to json 41175 1727204677.74497: done dumping result, returning 41175 1727204677.74502: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider [12b410aa-8751-f070-39c4-0000000000d8] 41175 1727204677.74510: sending task result for task 12b410aa-8751-f070-39c4-0000000000d8 41175 1727204677.74646: done sending task result for task 12b410aa-8751-f070-39c4-0000000000d8 41175 1727204677.74650: WORKER PROCESS EXITING ok: [managed-node3] => {} MSG: Using network provider: nm 41175 1727204677.74749: no more pending results, returning what we have 41175 1727204677.74754: results queue empty 41175 1727204677.74755: checking for any_errors_fatal 41175 1727204677.74767: done checking for any_errors_fatal 41175 1727204677.74768: checking for max_fail_percentage 41175 1727204677.74770: done checking for max_fail_percentage 41175 1727204677.74771: checking to see if all hosts have failed and the running result is not ok 41175 1727204677.74773: done checking to see if all hosts have failed 41175 1727204677.74774: getting the remaining hosts for this loop 41175 1727204677.74776: done getting the remaining hosts for this loop 41175 1727204677.74781: getting the next task for host managed-node3 41175 1727204677.74788: done getting next task for host managed-node3 41175 1727204677.74794: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 41175 1727204677.74797: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41175 1727204677.74809: getting variables 41175 1727204677.74811: in VariableManager get_vars() 41175 1727204677.74853: Calling all_inventory to load vars for managed-node3 41175 1727204677.74856: Calling groups_inventory to load vars for managed-node3 41175 1727204677.74859: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204677.74873: Calling all_plugins_play to load vars for managed-node3 41175 1727204677.74877: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204677.74881: Calling groups_plugins_play to load vars for managed-node3 41175 1727204677.77461: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204677.81093: done with get_vars() 41175 1727204677.81136: done getting variables 41175 1727204677.81214: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 15:04:37 -0400 (0:00:00.094) 0:00:44.951 ***** 41175 1727204677.81259: entering _queue_task() for managed-node3/fail 41175 1727204677.81648: worker is 1 (out of 1 available) 41175 1727204677.81663: exiting _queue_task() for managed-node3/fail 41175 1727204677.81675: done queuing things up, now waiting for results queue to drain 41175 1727204677.81677: waiting for pending results... 
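Annotation (not part of the captured log): the `Print network provider` task traced above, at roles/network/tasks/main.yml:7, is a plain `debug` action that rendered the message `Using network provider: nm`. A minimal sketch of what that task presumably looks like — the exact body is an assumption; only the task name, path, action plugin, and rendered message appear in the log:

```yaml
# Assumed shape of the task executed above; the log confirms only the
# task name, the debug action plugin, and the rendered message
# "Using network provider: nm" (network_provider was set via set_fact).
- name: Print network provider
  debug:
    msg: "Using network provider: {{ network_provider }}"
```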
41175 1727204677.82108: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 41175 1727204677.82143: in run() - task 12b410aa-8751-f070-39c4-0000000000d9 41175 1727204677.82161: variable 'ansible_search_path' from source: unknown 41175 1727204677.82164: variable 'ansible_search_path' from source: unknown 41175 1727204677.82206: calling self._execute() 41175 1727204677.82495: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204677.82499: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204677.82503: variable 'omit' from source: magic vars 41175 1727204677.82826: variable 'ansible_distribution_major_version' from source: facts 41175 1727204677.82839: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204677.83002: variable 'network_state' from source: role '' defaults 41175 1727204677.83028: Evaluated conditional (network_state != {}): False 41175 1727204677.83032: when evaluation is False, skipping this task 41175 1727204677.83035: _execute() done 41175 1727204677.83038: dumping result to json 41175 1727204677.83044: done dumping result, returning 41175 1727204677.83052: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12b410aa-8751-f070-39c4-0000000000d9] 41175 1727204677.83060: sending task result for task 12b410aa-8751-f070-39c4-0000000000d9 41175 1727204677.83163: done sending task result for task 12b410aa-8751-f070-39c4-0000000000d9 41175 1727204677.83167: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 41175 1727204677.83226: no more pending results, 
returning what we have 41175 1727204677.83232: results queue empty 41175 1727204677.83235: checking for any_errors_fatal 41175 1727204677.83244: done checking for any_errors_fatal 41175 1727204677.83245: checking for max_fail_percentage 41175 1727204677.83247: done checking for max_fail_percentage 41175 1727204677.83248: checking to see if all hosts have failed and the running result is not ok 41175 1727204677.83249: done checking to see if all hosts have failed 41175 1727204677.83250: getting the remaining hosts for this loop 41175 1727204677.83252: done getting the remaining hosts for this loop 41175 1727204677.83257: getting the next task for host managed-node3 41175 1727204677.83265: done getting next task for host managed-node3 41175 1727204677.83269: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 41175 1727204677.83272: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204677.83291: getting variables 41175 1727204677.83293: in VariableManager get_vars() 41175 1727204677.83338: Calling all_inventory to load vars for managed-node3 41175 1727204677.83341: Calling groups_inventory to load vars for managed-node3 41175 1727204677.83344: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204677.83360: Calling all_plugins_play to load vars for managed-node3 41175 1727204677.83363: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204677.83367: Calling groups_plugins_play to load vars for managed-node3 41175 1727204677.85884: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204677.89049: done with get_vars() 41175 1727204677.89095: done getting variables 41175 1727204677.89170: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 15:04:37 -0400 (0:00:00.079) 0:00:45.030 ***** 41175 1727204677.89210: entering _queue_task() for managed-node3/fail 41175 1727204677.89567: worker is 1 (out of 1 available) 41175 1727204677.89583: exiting _queue_task() for managed-node3/fail 41175 1727204677.89800: done queuing things up, now waiting for results queue to drain 41175 1727204677.89802: waiting for pending results... 
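Annotation (not part of the captured log): the two `Abort applying the network state configuration ...` tasks traced here (main.yml:11 and main.yml:18) are `fail` actions gated by the same condition, and both are skipped because `network_state` still holds its empty role default, so `network_state != {}` evaluates False. A sketch of the assumed shape — the log confirms only the action plugin and the false condition, not the failure message:

```yaml
# Assumed shape of the skipped guard tasks; the log shows
# false_condition: "network_state != {}" in each skipping result.
- name: Abort applying the network state configuration if using the network_state variable with the initscripts provider
  fail:
    msg: "..."  # actual message not captured in the log
  when: network_state != {}
```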
41175 1727204677.89917: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 41175 1727204677.90197: in run() - task 12b410aa-8751-f070-39c4-0000000000da 41175 1727204677.90203: variable 'ansible_search_path' from source: unknown 41175 1727204677.90206: variable 'ansible_search_path' from source: unknown 41175 1727204677.90209: calling self._execute() 41175 1727204677.90239: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204677.90247: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204677.90266: variable 'omit' from source: magic vars 41175 1727204677.90745: variable 'ansible_distribution_major_version' from source: facts 41175 1727204677.90761: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204677.90938: variable 'network_state' from source: role '' defaults 41175 1727204677.90954: Evaluated conditional (network_state != {}): False 41175 1727204677.90957: when evaluation is False, skipping this task 41175 1727204677.90960: _execute() done 41175 1727204677.90965: dumping result to json 41175 1727204677.90970: done dumping result, returning 41175 1727204677.90980: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [12b410aa-8751-f070-39c4-0000000000da] 41175 1727204677.90986: sending task result for task 12b410aa-8751-f070-39c4-0000000000da 41175 1727204677.91099: done sending task result for task 12b410aa-8751-f070-39c4-0000000000da 41175 1727204677.91102: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 41175 1727204677.91174: no more pending results, returning what we have 41175 
1727204677.91180: results queue empty 41175 1727204677.91181: checking for any_errors_fatal 41175 1727204677.91191: done checking for any_errors_fatal 41175 1727204677.91192: checking for max_fail_percentage 41175 1727204677.91194: done checking for max_fail_percentage 41175 1727204677.91195: checking to see if all hosts have failed and the running result is not ok 41175 1727204677.91196: done checking to see if all hosts have failed 41175 1727204677.91197: getting the remaining hosts for this loop 41175 1727204677.91200: done getting the remaining hosts for this loop 41175 1727204677.91204: getting the next task for host managed-node3 41175 1727204677.91296: done getting next task for host managed-node3 41175 1727204677.91301: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 41175 1727204677.91304: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204677.91326: getting variables 41175 1727204677.91328: in VariableManager get_vars() 41175 1727204677.91369: Calling all_inventory to load vars for managed-node3 41175 1727204677.91372: Calling groups_inventory to load vars for managed-node3 41175 1727204677.91375: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204677.91388: Calling all_plugins_play to load vars for managed-node3 41175 1727204677.91436: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204677.91441: Calling groups_plugins_play to load vars for managed-node3 41175 1727204677.94074: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204677.99173: done with get_vars() 41175 1727204677.99335: done getting variables 41175 1727204677.99410: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 15:04:37 -0400 (0:00:00.102) 0:00:45.133 ***** 41175 1727204677.99455: entering _queue_task() for managed-node3/fail 41175 1727204678.00396: worker is 1 (out of 1 available) 41175 1727204678.00412: exiting _queue_task() for managed-node3/fail 41175 1727204678.00427: done queuing things up, now waiting for results queue to drain 41175 1727204678.00429: waiting for pending results... 
41175 1727204678.00943: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 41175 1727204678.01231: in run() - task 12b410aa-8751-f070-39c4-0000000000db 41175 1727204678.01236: variable 'ansible_search_path' from source: unknown 41175 1727204678.01239: variable 'ansible_search_path' from source: unknown 41175 1727204678.01593: calling self._execute() 41175 1727204678.01705: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204678.01809: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204678.01829: variable 'omit' from source: magic vars 41175 1727204678.02856: variable 'ansible_distribution_major_version' from source: facts 41175 1727204678.02914: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204678.03556: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41175 1727204678.06707: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41175 1727204678.06795: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41175 1727204678.06850: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41175 1727204678.06899: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41175 1727204678.06940: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41175 1727204678.07043: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204678.07086: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204678.07126: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204678.07188: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204678.07213: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204678.07341: variable 'ansible_distribution_major_version' from source: facts 41175 1727204678.07370: Evaluated conditional (ansible_distribution_major_version | int > 9): True 41175 1727204678.07567: variable 'ansible_distribution' from source: facts 41175 1727204678.07571: variable '__network_rh_distros' from source: role '' defaults 41175 1727204678.07573: Evaluated conditional (ansible_distribution in __network_rh_distros): False 41175 1727204678.07576: when evaluation is False, skipping this task 41175 1727204678.07578: _execute() done 41175 1727204678.07580: dumping result to json 41175 1727204678.07587: done dumping result, returning 41175 1727204678.07602: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12b410aa-8751-f070-39c4-0000000000db] 41175 1727204678.07613: sending task result for task 12b410aa-8751-f070-39c4-0000000000db 41175 1727204678.07844: done sending task result for task 12b410aa-8751-f070-39c4-0000000000db 41175 1727204678.07848: WORKER PROCESS EXITING 
skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 41175 1727204678.08111: no more pending results, returning what we have 41175 1727204678.08115: results queue empty 41175 1727204678.08119: checking for any_errors_fatal 41175 1727204678.08127: done checking for any_errors_fatal 41175 1727204678.08128: checking for max_fail_percentage 41175 1727204678.08130: done checking for max_fail_percentage 41175 1727204678.08131: checking to see if all hosts have failed and the running result is not ok 41175 1727204678.08132: done checking to see if all hosts have failed 41175 1727204678.08133: getting the remaining hosts for this loop 41175 1727204678.08135: done getting the remaining hosts for this loop 41175 1727204678.08139: getting the next task for host managed-node3 41175 1727204678.08146: done getting next task for host managed-node3 41175 1727204678.08151: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 41175 1727204678.08153: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204678.08168: getting variables 41175 1727204678.08169: in VariableManager get_vars() 41175 1727204678.08219: Calling all_inventory to load vars for managed-node3 41175 1727204678.08222: Calling groups_inventory to load vars for managed-node3 41175 1727204678.08226: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204678.08237: Calling all_plugins_play to load vars for managed-node3 41175 1727204678.08240: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204678.08244: Calling groups_plugins_play to load vars for managed-node3 41175 1727204678.10638: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204678.14053: done with get_vars() 41175 1727204678.14094: done getting variables 41175 1727204678.14171: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 15:04:38 -0400 (0:00:00.147) 0:00:45.280 ***** 41175 1727204678.14210: entering _queue_task() for managed-node3/dnf 41175 1727204678.14688: worker is 1 (out of 1 available) 41175 1727204678.14708: exiting _queue_task() for managed-node3/dnf 41175 1727204678.14725: done queuing things up, now waiting for results queue to drain 41175 1727204678.14727: waiting for pending results... 
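Annotation (not part of the captured log): the teaming guard at main.yml:25 evaluates a chain of conditions visible in the trace — the version check `ansible_distribution_major_version | int > 9` is True, but `ansible_distribution in __network_rh_distros` is False, so the `fail` is skipped. (The recurring `ansible_distribution_major_version != '6'` conditional seen before every task is applied at a higher level, e.g. an include or block.) Assumed shape, with the failure message unknown:

```yaml
# Assumed shape; the log confirms the two per-task conditions and
# their results (> 9 evaluated True, distro membership evaluated False).
- name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
  fail:
    msg: "..."  # actual message not captured in the log
  when:
    - ansible_distribution_major_version | int > 9
    - ansible_distribution in __network_rh_distros
```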
41175 1727204678.15019: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 41175 1727204678.15160: in run() - task 12b410aa-8751-f070-39c4-0000000000dc 41175 1727204678.15165: variable 'ansible_search_path' from source: unknown 41175 1727204678.15168: variable 'ansible_search_path' from source: unknown 41175 1727204678.15269: calling self._execute() 41175 1727204678.15334: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204678.15399: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204678.15416: variable 'omit' from source: magic vars 41175 1727204678.15883: variable 'ansible_distribution_major_version' from source: facts 41175 1727204678.16010: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204678.16206: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41175 1727204678.18152: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41175 1727204678.18208: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41175 1727204678.18269: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41175 1727204678.18296: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41175 1727204678.18363: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41175 1727204678.18442: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204678.18498: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204678.18557: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204678.18610: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204678.18797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204678.18800: variable 'ansible_distribution' from source: facts 41175 1727204678.18803: variable 'ansible_distribution_major_version' from source: facts 41175 1727204678.18815: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 41175 1727204678.18987: variable '__network_wireless_connections_defined' from source: role '' defaults 41175 1727204678.19210: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204678.19232: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204678.19260: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204678.19294: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204678.19312: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204678.19347: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204678.19373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204678.19394: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204678.19426: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204678.19439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204678.19479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204678.19500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 
1727204678.19522: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204678.19552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204678.19565: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204678.19701: variable 'network_connections' from source: play vars 41175 1727204678.19712: variable 'profile' from source: play vars 41175 1727204678.19771: variable 'profile' from source: play vars 41175 1727204678.19774: variable 'interface' from source: set_fact 41175 1727204678.19832: variable 'interface' from source: set_fact 41175 1727204678.19894: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41175 1727204678.20049: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41175 1727204678.20082: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41175 1727204678.20109: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41175 1727204678.20139: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41175 1727204678.20176: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41175 1727204678.20196: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41175 1727204678.20224: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204678.20248: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41175 1727204678.20294: variable '__network_team_connections_defined' from source: role '' defaults 41175 1727204678.20499: variable 'network_connections' from source: play vars 41175 1727204678.20505: variable 'profile' from source: play vars 41175 1727204678.20558: variable 'profile' from source: play vars 41175 1727204678.20564: variable 'interface' from source: set_fact 41175 1727204678.20619: variable 'interface' from source: set_fact 41175 1727204678.20639: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 41175 1727204678.20642: when evaluation is False, skipping this task 41175 1727204678.20645: _execute() done 41175 1727204678.20650: dumping result to json 41175 1727204678.20655: done dumping result, returning 41175 1727204678.20666: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12b410aa-8751-f070-39c4-0000000000dc] 41175 1727204678.20669: sending task result for task 12b410aa-8751-f070-39c4-0000000000dc 41175 1727204678.20772: done sending task result for task 12b410aa-8751-f070-39c4-0000000000dc 41175 1727204678.20775: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": 
"Conditional result was False" } 41175 1727204678.20840: no more pending results, returning what we have 41175 1727204678.20845: results queue empty 41175 1727204678.20846: checking for any_errors_fatal 41175 1727204678.20855: done checking for any_errors_fatal 41175 1727204678.20856: checking for max_fail_percentage 41175 1727204678.20857: done checking for max_fail_percentage 41175 1727204678.20858: checking to see if all hosts have failed and the running result is not ok 41175 1727204678.20859: done checking to see if all hosts have failed 41175 1727204678.20860: getting the remaining hosts for this loop 41175 1727204678.20863: done getting the remaining hosts for this loop 41175 1727204678.20868: getting the next task for host managed-node3 41175 1727204678.20875: done getting next task for host managed-node3 41175 1727204678.20881: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 41175 1727204678.20885: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204678.20903: getting variables 41175 1727204678.20905: in VariableManager get_vars() 41175 1727204678.20950: Calling all_inventory to load vars for managed-node3 41175 1727204678.20953: Calling groups_inventory to load vars for managed-node3 41175 1727204678.20956: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204678.20967: Calling all_plugins_play to load vars for managed-node3 41175 1727204678.20970: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204678.20973: Calling groups_plugins_play to load vars for managed-node3 41175 1727204678.23111: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204678.26180: done with get_vars() 41175 1727204678.26241: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 41175 1727204678.26340: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 15:04:38 -0400 (0:00:00.121) 0:00:45.402 ***** 41175 1727204678.26377: entering _queue_task() for managed-node3/yum 41175 1727204678.26771: worker is 1 (out of 1 available) 41175 1727204678.26785: exiting _queue_task() for managed-node3/yum 41175 1727204678.27001: done queuing things up, now waiting for results queue to drain 41175 1727204678.27003: waiting for pending results... 
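The skip result above shows how Ansible reports a `when:` conditional that evaluated to False: the task is never executed and a JSON blob with `false_condition` and `skip_reason` is emitted instead. The following is a minimal sketch, not Ansible's actual implementation, of how a conditional like `ansible_distribution_major_version | int < 8` turns into that skip record; the fact value `"39"` is a hypothetical example, not taken from this log.

```python
# Hedged sketch: mimic the skip-result JSON Ansible prints when a
# `when:` conditional such as "ansible_distribution_major_version | int < 8"
# evaluates to False. This is illustrative only, not Ansible internals.

def evaluate_when(facts: dict) -> dict:
    """Return a skip-style result dict when the conditional is False."""
    condition = "ansible_distribution_major_version | int < 8"
    # Jinja2's `int` filter coerces the string fact to an integer.
    result = int(facts["ansible_distribution_major_version"]) < 8
    if not result:
        return {
            "changed": False,
            "false_condition": condition,
            "skip_reason": "Conditional result was False",
        }
    return {"changed": True}

# A modern distro major version (hypothetical) fails the `< 8` test,
# so the task is skipped, matching the log output above.
print(evaluate_when({"ansible_distribution_major_version": "39"}))
```

On EL7-era hosts the same conditional would be True and the yum-based update check would actually run; on this host it short-circuits to a skip before any module code is pushed to the node.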
41175 1727204678.27212: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 41175 1727204678.27294: in run() - task 12b410aa-8751-f070-39c4-0000000000dd 41175 1727204678.27322: variable 'ansible_search_path' from source: unknown 41175 1727204678.27336: variable 'ansible_search_path' from source: unknown 41175 1727204678.27387: calling self._execute() 41175 1727204678.27520: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204678.27537: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204678.27671: variable 'omit' from source: magic vars 41175 1727204678.28041: variable 'ansible_distribution_major_version' from source: facts 41175 1727204678.28063: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204678.28312: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41175 1727204678.31065: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41175 1727204678.31161: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41175 1727204678.31212: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41175 1727204678.31266: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41175 1727204678.31305: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41175 1727204678.31412: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204678.31457: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204678.31502: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204678.31563: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204678.31594: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204678.31726: variable 'ansible_distribution_major_version' from source: facts 41175 1727204678.31750: Evaluated conditional (ansible_distribution_major_version | int < 8): False 41175 1727204678.31760: when evaluation is False, skipping this task 41175 1727204678.31767: _execute() done 41175 1727204678.31895: dumping result to json 41175 1727204678.31898: done dumping result, returning 41175 1727204678.31902: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12b410aa-8751-f070-39c4-0000000000dd] 41175 1727204678.31904: sending task result for task 12b410aa-8751-f070-39c4-0000000000dd 41175 1727204678.31986: done sending task result for task 12b410aa-8751-f070-39c4-0000000000dd 41175 1727204678.31992: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 41175 1727204678.32056: no more pending results, returning 
what we have 41175 1727204678.32061: results queue empty 41175 1727204678.32062: checking for any_errors_fatal 41175 1727204678.32072: done checking for any_errors_fatal 41175 1727204678.32073: checking for max_fail_percentage 41175 1727204678.32075: done checking for max_fail_percentage 41175 1727204678.32076: checking to see if all hosts have failed and the running result is not ok 41175 1727204678.32078: done checking to see if all hosts have failed 41175 1727204678.32078: getting the remaining hosts for this loop 41175 1727204678.32081: done getting the remaining hosts for this loop 41175 1727204678.32086: getting the next task for host managed-node3 41175 1727204678.32096: done getting next task for host managed-node3 41175 1727204678.32101: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 41175 1727204678.32103: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204678.32123: getting variables 41175 1727204678.32125: in VariableManager get_vars() 41175 1727204678.32174: Calling all_inventory to load vars for managed-node3 41175 1727204678.32178: Calling groups_inventory to load vars for managed-node3 41175 1727204678.32181: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204678.32397: Calling all_plugins_play to load vars for managed-node3 41175 1727204678.32402: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204678.32409: Calling groups_plugins_play to load vars for managed-node3 41175 1727204678.35095: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204678.38088: done with get_vars() 41175 1727204678.38140: done getting variables 41175 1727204678.38215: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 15:04:38 -0400 (0:00:00.118) 0:00:45.521 ***** 41175 1727204678.38255: entering _queue_task() for managed-node3/fail 41175 1727204678.38738: worker is 1 (out of 1 available) 41175 1727204678.38754: exiting _queue_task() for managed-node3/fail 41175 1727204678.38765: done queuing things up, now waiting for results queue to drain 41175 1727204678.38768: waiting for pending results... 
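Several tasks in this section skip on the compound conditional `__network_wireless_connections_defined or __network_team_connections_defined`. In the role, those defaults boil down to checking whether any profile in `network_connections` has a wireless or team type. A rough, hedged approximation of that check (the connection list below is hypothetical, and the helper name `any_of_type` is mine, not the role's):

```python
# Hedged sketch: approximate what the role's defaults compute when deciding
# whether wireless/team-specific tasks should run. Not the role's real code.

def any_of_type(connections: list, wanted: set) -> bool:
    """True if any connection profile has one of the wanted types."""
    return any(c.get("type") in wanted for c in connections)

# Hypothetical play vars: a single ethernet profile, as in a typical
# interface test play.
conns = [{"name": "eth0", "type": "ethernet", "state": "up"}]

wireless_defined = any_of_type(conns, {"wireless"})
team_defined = any_of_type(conns, {"team"})

# An ethernet-only play makes the compound conditional False, so the
# NetworkManager-restart-consent task is skipped, as the log shows.
print(wireless_defined or team_defined)
```

This explains why both the DNF/YUM update checks and the restart-consent prompt skip back to back: all three tasks gate on the same pair of derived booleans.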
41175 1727204678.39112: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 41175 1727204678.39164: in run() - task 12b410aa-8751-f070-39c4-0000000000de 41175 1727204678.39296: variable 'ansible_search_path' from source: unknown 41175 1727204678.39299: variable 'ansible_search_path' from source: unknown 41175 1727204678.39302: calling self._execute() 41175 1727204678.39362: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204678.39378: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204678.39399: variable 'omit' from source: magic vars 41175 1727204678.39870: variable 'ansible_distribution_major_version' from source: facts 41175 1727204678.39891: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204678.40057: variable '__network_wireless_connections_defined' from source: role '' defaults 41175 1727204678.40330: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41175 1727204678.43060: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41175 1727204678.43156: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41175 1727204678.43234: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41175 1727204678.43265: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41175 1727204678.43303: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41175 1727204678.43595: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 41175 1727204678.43599: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204678.43603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204678.43605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204678.43608: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204678.43644: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204678.43678: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204678.43722: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204678.43781: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204678.43806: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204678.43871: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204678.43909: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204678.43955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204678.44013: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204678.44039: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204678.44284: variable 'network_connections' from source: play vars 41175 1727204678.44306: variable 'profile' from source: play vars 41175 1727204678.44407: variable 'profile' from source: play vars 41175 1727204678.44420: variable 'interface' from source: set_fact 41175 1727204678.44504: variable 'interface' from source: set_fact 41175 1727204678.44605: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41175 1727204678.44841: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41175 1727204678.44925: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41175 1727204678.44944: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41175 1727204678.44984: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41175 1727204678.45049: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41175 1727204678.45081: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41175 1727204678.45142: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204678.45166: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41175 1727204678.45231: variable '__network_team_connections_defined' from source: role '' defaults 41175 1727204678.45582: variable 'network_connections' from source: play vars 41175 1727204678.45684: variable 'profile' from source: play vars 41175 1727204678.45687: variable 'profile' from source: play vars 41175 1727204678.45692: variable 'interface' from source: set_fact 41175 1727204678.45765: variable 'interface' from source: set_fact 41175 1727204678.45804: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 41175 1727204678.45812: when evaluation is False, skipping this task 41175 1727204678.45822: _execute() done 41175 1727204678.45831: dumping result to json 41175 1727204678.45839: done dumping result, returning 41175 1727204678.45850: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's 
consent to restart NetworkManager due to wireless or team interfaces [12b410aa-8751-f070-39c4-0000000000de] 41175 1727204678.45868: sending task result for task 12b410aa-8751-f070-39c4-0000000000de skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 41175 1727204678.46060: no more pending results, returning what we have 41175 1727204678.46064: results queue empty 41175 1727204678.46065: checking for any_errors_fatal 41175 1727204678.46074: done checking for any_errors_fatal 41175 1727204678.46075: checking for max_fail_percentage 41175 1727204678.46077: done checking for max_fail_percentage 41175 1727204678.46078: checking to see if all hosts have failed and the running result is not ok 41175 1727204678.46079: done checking to see if all hosts have failed 41175 1727204678.46080: getting the remaining hosts for this loop 41175 1727204678.46082: done getting the remaining hosts for this loop 41175 1727204678.46087: getting the next task for host managed-node3 41175 1727204678.46096: done getting next task for host managed-node3 41175 1727204678.46101: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 41175 1727204678.46103: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204678.46123: getting variables 41175 1727204678.46125: in VariableManager get_vars() 41175 1727204678.46169: Calling all_inventory to load vars for managed-node3 41175 1727204678.46172: Calling groups_inventory to load vars for managed-node3 41175 1727204678.46175: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204678.46188: Calling all_plugins_play to load vars for managed-node3 41175 1727204678.46497: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204678.46503: Calling groups_plugins_play to load vars for managed-node3 41175 1727204678.47206: done sending task result for task 12b410aa-8751-f070-39c4-0000000000de 41175 1727204678.47210: WORKER PROCESS EXITING 41175 1727204678.48838: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204678.51982: done with get_vars() 41175 1727204678.52033: done getting variables 41175 1727204678.52111: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 15:04:38 -0400 (0:00:00.138) 0:00:45.660 ***** 41175 1727204678.52157: entering _queue_task() for managed-node3/package 41175 1727204678.52598: worker is 1 (out of 1 available) 41175 1727204678.52614: exiting _queue_task() for managed-node3/package 41175 1727204678.52627: done queuing things up, now waiting for results queue to drain 41175 1727204678.52629: waiting for pending results... 
41175 1727204678.52870: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages 41175 1727204678.52974: in run() - task 12b410aa-8751-f070-39c4-0000000000df 41175 1727204678.52989: variable 'ansible_search_path' from source: unknown 41175 1727204678.52996: variable 'ansible_search_path' from source: unknown 41175 1727204678.53034: calling self._execute() 41175 1727204678.53120: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204678.53130: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204678.53141: variable 'omit' from source: magic vars 41175 1727204678.53473: variable 'ansible_distribution_major_version' from source: facts 41175 1727204678.53484: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204678.53665: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41175 1727204678.53896: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41175 1727204678.53936: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41175 1727204678.53968: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41175 1727204678.54001: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41175 1727204678.54097: variable 'network_packages' from source: role '' defaults 41175 1727204678.54187: variable '__network_provider_setup' from source: role '' defaults 41175 1727204678.54202: variable '__network_service_name_default_nm' from source: role '' defaults 41175 1727204678.54263: variable '__network_service_name_default_nm' from source: role '' defaults 41175 1727204678.54272: variable '__network_packages_default_nm' from source: role '' defaults 41175 1727204678.59930: variable 
'__network_packages_default_nm' from source: role '' defaults 41175 1727204678.60160: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41175 1727204678.61752: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41175 1727204678.61811: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41175 1727204678.61843: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41175 1727204678.61870: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41175 1727204678.61897: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41175 1727204678.61957: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204678.61981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204678.62009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204678.62044: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204678.62056: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 
1727204678.62096: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204678.62120: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204678.62142: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204678.62173: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204678.62185: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204678.62373: variable '__network_packages_default_gobject_packages' from source: role '' defaults 41175 1727204678.62472: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204678.62493: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204678.62514: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204678.62551: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204678.62564: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204678.62640: variable 'ansible_python' from source: facts 41175 1727204678.62663: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 41175 1727204678.62734: variable '__network_wpa_supplicant_required' from source: role '' defaults 41175 1727204678.62805: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 41175 1727204678.62917: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204678.62938: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204678.62960: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204678.62995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204678.63008: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204678.63050: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204678.63075: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204678.63099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204678.63132: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204678.63146: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204678.63266: variable 'network_connections' from source: play vars 41175 1727204678.63271: variable 'profile' from source: play vars 41175 1727204678.63358: variable 'profile' from source: play vars 41175 1727204678.63365: variable 'interface' from source: set_fact 41175 1727204678.63430: variable 'interface' from source: set_fact 41175 1727204678.63483: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41175 1727204678.63509: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41175 1727204678.63540: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204678.63567: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41175 1727204678.63599: variable '__network_wireless_connections_defined' from source: role '' defaults 41175 1727204678.63837: variable 'network_connections' from source: play vars 41175 1727204678.63841: variable 'profile' from source: play vars 41175 1727204678.63927: variable 'profile' from source: play vars 41175 1727204678.63933: variable 'interface' from source: set_fact 41175 1727204678.63996: variable 'interface' from source: set_fact 41175 1727204678.64026: variable '__network_packages_default_wireless' from source: role '' defaults 41175 1727204678.64094: variable '__network_wireless_connections_defined' from source: role '' defaults 41175 1727204678.64362: variable 'network_connections' from source: play vars 41175 1727204678.64366: variable 'profile' from source: play vars 41175 1727204678.64427: variable 'profile' from source: play vars 41175 1727204678.64431: variable 'interface' from source: set_fact 41175 1727204678.64514: variable 'interface' from source: set_fact 41175 1727204678.64539: variable '__network_packages_default_team' from source: role '' defaults 41175 1727204678.64603: variable '__network_team_connections_defined' from source: role '' defaults 41175 1727204678.64870: variable 'network_connections' from source: play vars 41175 1727204678.64874: variable 'profile' from source: play vars 41175 1727204678.64932: variable 'profile' from source: play vars 41175 1727204678.64937: variable 'interface' from source: set_fact 41175 1727204678.65022: variable 'interface' from source: set_fact 41175 1727204678.65070: variable '__network_service_name_default_initscripts' from source: role '' defaults 41175 1727204678.65122: 
variable '__network_service_name_default_initscripts' from source: role '' defaults 41175 1727204678.65130: variable '__network_packages_default_initscripts' from source: role '' defaults 41175 1727204678.65184: variable '__network_packages_default_initscripts' from source: role '' defaults 41175 1727204678.65365: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 41175 1727204678.65761: variable 'network_connections' from source: play vars 41175 1727204678.65765: variable 'profile' from source: play vars 41175 1727204678.65818: variable 'profile' from source: play vars 41175 1727204678.65826: variable 'interface' from source: set_fact 41175 1727204678.65879: variable 'interface' from source: set_fact 41175 1727204678.65888: variable 'ansible_distribution' from source: facts 41175 1727204678.65894: variable '__network_rh_distros' from source: role '' defaults 41175 1727204678.65901: variable 'ansible_distribution_major_version' from source: facts 41175 1727204678.65914: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 41175 1727204678.66056: variable 'ansible_distribution' from source: facts 41175 1727204678.66060: variable '__network_rh_distros' from source: role '' defaults 41175 1727204678.66066: variable 'ansible_distribution_major_version' from source: facts 41175 1727204678.66073: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 41175 1727204678.66212: variable 'ansible_distribution' from source: facts 41175 1727204678.66215: variable '__network_rh_distros' from source: role '' defaults 41175 1727204678.66224: variable 'ansible_distribution_major_version' from source: facts 41175 1727204678.66253: variable 'network_provider' from source: set_fact 41175 1727204678.66269: variable 'ansible_facts' from source: unknown 41175 1727204678.66880: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 41175 
1727204678.66884: when evaluation is False, skipping this task 41175 1727204678.66887: _execute() done 41175 1727204678.66891: dumping result to json 41175 1727204678.66893: done dumping result, returning 41175 1727204678.66903: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages [12b410aa-8751-f070-39c4-0000000000df] 41175 1727204678.66906: sending task result for task 12b410aa-8751-f070-39c4-0000000000df 41175 1727204678.67006: done sending task result for task 12b410aa-8751-f070-39c4-0000000000df 41175 1727204678.67009: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 41175 1727204678.67068: no more pending results, returning what we have 41175 1727204678.67071: results queue empty 41175 1727204678.67072: checking for any_errors_fatal 41175 1727204678.67082: done checking for any_errors_fatal 41175 1727204678.67083: checking for max_fail_percentage 41175 1727204678.67084: done checking for max_fail_percentage 41175 1727204678.67085: checking to see if all hosts have failed and the running result is not ok 41175 1727204678.67086: done checking to see if all hosts have failed 41175 1727204678.67087: getting the remaining hosts for this loop 41175 1727204678.67091: done getting the remaining hosts for this loop 41175 1727204678.67095: getting the next task for host managed-node3 41175 1727204678.67103: done getting next task for host managed-node3 41175 1727204678.67107: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 41175 1727204678.67110: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 41175 1727204678.67127: getting variables 41175 1727204678.67129: in VariableManager get_vars() 41175 1727204678.67170: Calling all_inventory to load vars for managed-node3 41175 1727204678.67173: Calling groups_inventory to load vars for managed-node3 41175 1727204678.67176: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204678.67200: Calling all_plugins_play to load vars for managed-node3 41175 1727204678.67203: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204678.67207: Calling groups_plugins_play to load vars for managed-node3 41175 1727204678.72294: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204678.73893: done with get_vars() 41175 1727204678.73917: done getting variables 41175 1727204678.73964: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 15:04:38 -0400 (0:00:00.218) 0:00:45.878 ***** 41175 1727204678.73985: entering _queue_task() for managed-node3/package 41175 1727204678.74259: worker is 1 (out of 1 available) 41175 1727204678.74275: exiting _queue_task() for managed-node3/package 41175 1727204678.74291: done queuing things up, now waiting for results queue to drain 41175 1727204678.74294: waiting for pending results... 
41175 1727204678.74494: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 41175 1727204678.74614: in run() - task 12b410aa-8751-f070-39c4-0000000000e0 41175 1727204678.74630: variable 'ansible_search_path' from source: unknown 41175 1727204678.74634: variable 'ansible_search_path' from source: unknown 41175 1727204678.74669: calling self._execute() 41175 1727204678.74756: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204678.74765: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204678.74777: variable 'omit' from source: magic vars 41175 1727204678.75102: variable 'ansible_distribution_major_version' from source: facts 41175 1727204678.75113: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204678.75223: variable 'network_state' from source: role '' defaults 41175 1727204678.75229: Evaluated conditional (network_state != {}): False 41175 1727204678.75232: when evaluation is False, skipping this task 41175 1727204678.75238: _execute() done 41175 1727204678.75242: dumping result to json 41175 1727204678.75244: done dumping result, returning 41175 1727204678.75255: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12b410aa-8751-f070-39c4-0000000000e0] 41175 1727204678.75262: sending task result for task 12b410aa-8751-f070-39c4-0000000000e0 41175 1727204678.75366: done sending task result for task 12b410aa-8751-f070-39c4-0000000000e0 41175 1727204678.75369: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 41175 1727204678.75422: no more pending results, returning what we have 41175 1727204678.75427: results queue empty 41175 1727204678.75428: checking 
for any_errors_fatal 41175 1727204678.75439: done checking for any_errors_fatal 41175 1727204678.75440: checking for max_fail_percentage 41175 1727204678.75442: done checking for max_fail_percentage 41175 1727204678.75443: checking to see if all hosts have failed and the running result is not ok 41175 1727204678.75444: done checking to see if all hosts have failed 41175 1727204678.75445: getting the remaining hosts for this loop 41175 1727204678.75447: done getting the remaining hosts for this loop 41175 1727204678.75451: getting the next task for host managed-node3 41175 1727204678.75457: done getting next task for host managed-node3 41175 1727204678.75461: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 41175 1727204678.75463: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204678.75478: getting variables 41175 1727204678.75480: in VariableManager get_vars() 41175 1727204678.75520: Calling all_inventory to load vars for managed-node3 41175 1727204678.75523: Calling groups_inventory to load vars for managed-node3 41175 1727204678.75526: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204678.75536: Calling all_plugins_play to load vars for managed-node3 41175 1727204678.75539: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204678.75542: Calling groups_plugins_play to load vars for managed-node3 41175 1727204678.76848: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204678.78486: done with get_vars() 41175 1727204678.78510: done getting variables 41175 1727204678.78561: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 15:04:38 -0400 (0:00:00.045) 0:00:45.924 ***** 41175 1727204678.78584: entering _queue_task() for managed-node3/package 41175 1727204678.78829: worker is 1 (out of 1 available) 41175 1727204678.78844: exiting _queue_task() for managed-node3/package 41175 1727204678.78857: done queuing things up, now waiting for results queue to drain 41175 1727204678.78859: waiting for pending results... 
41175 1727204678.79053: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 41175 1727204678.79139: in run() - task 12b410aa-8751-f070-39c4-0000000000e1 41175 1727204678.79154: variable 'ansible_search_path' from source: unknown 41175 1727204678.79158: variable 'ansible_search_path' from source: unknown 41175 1727204678.79192: calling self._execute() 41175 1727204678.79276: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204678.79283: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204678.79294: variable 'omit' from source: magic vars 41175 1727204678.79620: variable 'ansible_distribution_major_version' from source: facts 41175 1727204678.79631: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204678.79734: variable 'network_state' from source: role '' defaults 41175 1727204678.79746: Evaluated conditional (network_state != {}): False 41175 1727204678.79749: when evaluation is False, skipping this task 41175 1727204678.79754: _execute() done 41175 1727204678.79757: dumping result to json 41175 1727204678.79767: done dumping result, returning 41175 1727204678.79771: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12b410aa-8751-f070-39c4-0000000000e1] 41175 1727204678.79777: sending task result for task 12b410aa-8751-f070-39c4-0000000000e1 41175 1727204678.79879: done sending task result for task 12b410aa-8751-f070-39c4-0000000000e1 41175 1727204678.79882: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 41175 1727204678.79937: no more pending results, returning what we have 41175 1727204678.79941: results queue empty 41175 1727204678.79942: checking for 
any_errors_fatal 41175 1727204678.79950: done checking for any_errors_fatal 41175 1727204678.79951: checking for max_fail_percentage 41175 1727204678.79953: done checking for max_fail_percentage 41175 1727204678.79954: checking to see if all hosts have failed and the running result is not ok 41175 1727204678.79955: done checking to see if all hosts have failed 41175 1727204678.79956: getting the remaining hosts for this loop 41175 1727204678.79958: done getting the remaining hosts for this loop 41175 1727204678.79962: getting the next task for host managed-node3 41175 1727204678.79967: done getting next task for host managed-node3 41175 1727204678.79971: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 41175 1727204678.79973: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204678.79987: getting variables 41175 1727204678.79991: in VariableManager get_vars() 41175 1727204678.80028: Calling all_inventory to load vars for managed-node3 41175 1727204678.80031: Calling groups_inventory to load vars for managed-node3 41175 1727204678.80033: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204678.80044: Calling all_plugins_play to load vars for managed-node3 41175 1727204678.80047: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204678.80051: Calling groups_plugins_play to load vars for managed-node3 41175 1727204678.81269: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204678.82986: done with get_vars() 41175 1727204678.83014: done getting variables 41175 1727204678.83061: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 15:04:38 -0400 (0:00:00.044) 0:00:45.969 ***** 41175 1727204678.83084: entering _queue_task() for managed-node3/service 41175 1727204678.83314: worker is 1 (out of 1 available) 41175 1727204678.83330: exiting _queue_task() for managed-node3/service 41175 1727204678.83342: done queuing things up, now waiting for results queue to drain 41175 1727204678.83344: waiting for pending results... 
41175 1727204678.83531: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 41175 1727204678.83608: in run() - task 12b410aa-8751-f070-39c4-0000000000e2 41175 1727204678.83622: variable 'ansible_search_path' from source: unknown 41175 1727204678.83626: variable 'ansible_search_path' from source: unknown 41175 1727204678.83656: calling self._execute() 41175 1727204678.83738: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204678.83745: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204678.83755: variable 'omit' from source: magic vars 41175 1727204678.84056: variable 'ansible_distribution_major_version' from source: facts 41175 1727204678.84067: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204678.84172: variable '__network_wireless_connections_defined' from source: role '' defaults 41175 1727204678.84349: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41175 1727204678.86087: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41175 1727204678.86146: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41175 1727204678.86178: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41175 1727204678.86211: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41175 1727204678.86235: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41175 1727204678.86309: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 41175 1727204678.86334: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204678.86356: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204678.86388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204678.86403: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204678.86448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204678.86469: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204678.86490: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204678.86525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204678.86539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204678.86573: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204678.86595: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204678.86619: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204678.86654: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204678.86667: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204678.86807: variable 'network_connections' from source: play vars 41175 1727204678.86820: variable 'profile' from source: play vars 41175 1727204678.86881: variable 'profile' from source: play vars 41175 1727204678.86885: variable 'interface' from source: set_fact 41175 1727204678.86939: variable 'interface' from source: set_fact 41175 1727204678.87003: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41175 1727204678.87145: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41175 1727204678.87181: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41175 1727204678.87211: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41175 1727204678.87237: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41175 1727204678.87275: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41175 1727204678.87299: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41175 1727204678.87321: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204678.87341: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41175 1727204678.87384: variable '__network_team_connections_defined' from source: role '' defaults 41175 1727204678.87590: variable 'network_connections' from source: play vars 41175 1727204678.87594: variable 'profile' from source: play vars 41175 1727204678.87650: variable 'profile' from source: play vars 41175 1727204678.87654: variable 'interface' from source: set_fact 41175 1727204678.87704: variable 'interface' from source: set_fact 41175 1727204678.87731: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 41175 1727204678.87734: when evaluation is False, skipping this task 41175 1727204678.87737: _execute() done 41175 1727204678.87740: dumping result to json 41175 1727204678.87742: done dumping result, returning 41175 1727204678.87750: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart 
NetworkManager due to wireless or team interfaces [12b410aa-8751-f070-39c4-0000000000e2] 41175 1727204678.87761: sending task result for task 12b410aa-8751-f070-39c4-0000000000e2 41175 1727204678.87853: done sending task result for task 12b410aa-8751-f070-39c4-0000000000e2 41175 1727204678.87856: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 41175 1727204678.87906: no more pending results, returning what we have 41175 1727204678.87910: results queue empty 41175 1727204678.87911: checking for any_errors_fatal 41175 1727204678.87922: done checking for any_errors_fatal 41175 1727204678.87923: checking for max_fail_percentage 41175 1727204678.87925: done checking for max_fail_percentage 41175 1727204678.87926: checking to see if all hosts have failed and the running result is not ok 41175 1727204678.87927: done checking to see if all hosts have failed 41175 1727204678.87928: getting the remaining hosts for this loop 41175 1727204678.87929: done getting the remaining hosts for this loop 41175 1727204678.87934: getting the next task for host managed-node3 41175 1727204678.87940: done getting next task for host managed-node3 41175 1727204678.87945: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 41175 1727204678.87947: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204678.87963: getting variables 41175 1727204678.87965: in VariableManager get_vars() 41175 1727204678.88012: Calling all_inventory to load vars for managed-node3 41175 1727204678.88015: Calling groups_inventory to load vars for managed-node3 41175 1727204678.88021: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204678.88031: Calling all_plugins_play to load vars for managed-node3 41175 1727204678.88034: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204678.88038: Calling groups_plugins_play to load vars for managed-node3 41175 1727204678.89312: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204678.90944: done with get_vars() 41175 1727204678.90967: done getting variables 41175 1727204678.91020: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 15:04:38 -0400 (0:00:00.079) 0:00:46.049 ***** 41175 1727204678.91047: entering _queue_task() for managed-node3/service 41175 1727204678.91306: worker is 1 (out of 1 available) 41175 1727204678.91324: exiting _queue_task() for managed-node3/service 41175 1727204678.91337: done queuing things up, now waiting for results queue to drain 41175 1727204678.91339: waiting for pending results... 
41175 1727204678.91536: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 41175 1727204678.91627: in run() - task 12b410aa-8751-f070-39c4-0000000000e3 41175 1727204678.91640: variable 'ansible_search_path' from source: unknown 41175 1727204678.91644: variable 'ansible_search_path' from source: unknown 41175 1727204678.91677: calling self._execute() 41175 1727204678.91771: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204678.91778: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204678.91796: variable 'omit' from source: magic vars 41175 1727204678.92123: variable 'ansible_distribution_major_version' from source: facts 41175 1727204678.92137: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204678.92277: variable 'network_provider' from source: set_fact 41175 1727204678.92281: variable 'network_state' from source: role '' defaults 41175 1727204678.92293: Evaluated conditional (network_provider == "nm" or network_state != {}): True 41175 1727204678.92300: variable 'omit' from source: magic vars 41175 1727204678.92337: variable 'omit' from source: magic vars 41175 1727204678.92365: variable 'network_service_name' from source: role '' defaults 41175 1727204678.92430: variable 'network_service_name' from source: role '' defaults 41175 1727204678.92523: variable '__network_provider_setup' from source: role '' defaults 41175 1727204678.92530: variable '__network_service_name_default_nm' from source: role '' defaults 41175 1727204678.92587: variable '__network_service_name_default_nm' from source: role '' defaults 41175 1727204678.92597: variable '__network_packages_default_nm' from source: role '' defaults 41175 1727204678.92651: variable '__network_packages_default_nm' from source: role '' defaults 41175 1727204678.92851: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 41175 1727204678.94578: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41175 1727204678.94918: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41175 1727204678.94954: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41175 1727204678.94985: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41175 1727204678.95010: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41175 1727204678.95083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204678.95109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204678.95133: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204678.95170: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204678.95183: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204678.95228: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 41175 1727204678.95247: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204678.95268: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204678.95306: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204678.95318: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204678.95510: variable '__network_packages_default_gobject_packages' from source: role '' defaults 41175 1727204678.95606: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204678.95632: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204678.95652: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204678.95682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204678.95695: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204678.95773: variable 'ansible_python' from source: facts 41175 1727204678.95794: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 41175 1727204678.95868: variable '__network_wpa_supplicant_required' from source: role '' defaults 41175 1727204678.95939: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 41175 1727204678.96047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204678.96070: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204678.96091: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204678.96124: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204678.96140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204678.96181: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204678.96207: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204678.96230: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204678.96263: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204678.96277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204678.96396: variable 'network_connections' from source: play vars 41175 1727204678.96403: variable 'profile' from source: play vars 41175 1727204678.96466: variable 'profile' from source: play vars 41175 1727204678.96479: variable 'interface' from source: set_fact 41175 1727204678.96534: variable 'interface' from source: set_fact 41175 1727204678.96623: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41175 1727204678.96765: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41175 1727204678.96823: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41175 1727204678.96857: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41175 1727204678.96893: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41175 1727204678.96947: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41175 1727204678.96972: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41175 1727204678.97000: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204678.97033: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41175 1727204678.97072: variable '__network_wireless_connections_defined' from source: role '' defaults 41175 1727204678.97305: variable 'network_connections' from source: play vars 41175 1727204678.97312: variable 'profile' from source: play vars 41175 1727204678.97376: variable 'profile' from source: play vars 41175 1727204678.97382: variable 'interface' from source: set_fact 41175 1727204678.97435: variable 'interface' from source: set_fact 41175 1727204678.97463: variable '__network_packages_default_wireless' from source: role '' defaults 41175 1727204678.97533: variable '__network_wireless_connections_defined' from source: role '' defaults 41175 1727204678.97805: variable 'network_connections' from source: play vars 41175 1727204678.97809: variable 'profile' from source: play vars 41175 1727204678.97868: variable 'profile' from source: play vars 41175 1727204678.97871: variable 'interface' from source: set_fact 41175 1727204678.97940: variable 'interface' from source: set_fact 41175 1727204678.97963: variable '__network_packages_default_team' from source: role '' defaults 41175 1727204678.98032: variable '__network_team_connections_defined' from source: role '' defaults 41175 1727204678.98270: variable 
'network_connections' from source: play vars 41175 1727204678.98274: variable 'profile' from source: play vars 41175 1727204678.98339: variable 'profile' from source: play vars 41175 1727204678.98343: variable 'interface' from source: set_fact 41175 1727204678.98403: variable 'interface' from source: set_fact 41175 1727204678.98453: variable '__network_service_name_default_initscripts' from source: role '' defaults 41175 1727204678.98504: variable '__network_service_name_default_initscripts' from source: role '' defaults 41175 1727204678.98511: variable '__network_packages_default_initscripts' from source: role '' defaults 41175 1727204678.98564: variable '__network_packages_default_initscripts' from source: role '' defaults 41175 1727204678.98746: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 41175 1727204678.99172: variable 'network_connections' from source: play vars 41175 1727204678.99176: variable 'profile' from source: play vars 41175 1727204678.99230: variable 'profile' from source: play vars 41175 1727204678.99234: variable 'interface' from source: set_fact 41175 1727204678.99292: variable 'interface' from source: set_fact 41175 1727204678.99302: variable 'ansible_distribution' from source: facts 41175 1727204678.99305: variable '__network_rh_distros' from source: role '' defaults 41175 1727204678.99322: variable 'ansible_distribution_major_version' from source: facts 41175 1727204678.99328: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 41175 1727204678.99471: variable 'ansible_distribution' from source: facts 41175 1727204678.99475: variable '__network_rh_distros' from source: role '' defaults 41175 1727204678.99481: variable 'ansible_distribution_major_version' from source: facts 41175 1727204678.99488: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 41175 1727204678.99633: variable 'ansible_distribution' from source: 
facts 41175 1727204678.99638: variable '__network_rh_distros' from source: role '' defaults 41175 1727204678.99648: variable 'ansible_distribution_major_version' from source: facts 41175 1727204678.99675: variable 'network_provider' from source: set_fact 41175 1727204678.99696: variable 'omit' from source: magic vars 41175 1727204678.99722: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41175 1727204678.99745: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41175 1727204678.99763: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41175 1727204678.99782: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204678.99793: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204678.99823: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41175 1727204678.99826: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204678.99831: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204678.99925: Set connection var ansible_shell_executable to /bin/sh 41175 1727204678.99928: Set connection var ansible_shell_type to sh 41175 1727204678.99935: Set connection var ansible_pipelining to False 41175 1727204678.99943: Set connection var ansible_timeout to 10 41175 1727204678.99950: Set connection var ansible_connection to ssh 41175 1727204678.99957: Set connection var ansible_module_compression to ZIP_DEFLATED 41175 1727204678.99983: variable 'ansible_shell_executable' from source: unknown 41175 1727204678.99986: variable 'ansible_connection' from source: unknown 41175 1727204678.99991: variable 'ansible_module_compression' from source: unknown 41175 1727204678.99994: 
variable 'ansible_shell_type' from source: unknown 41175 1727204678.99996: variable 'ansible_shell_executable' from source: unknown 41175 1727204678.99998: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204679.00005: variable 'ansible_pipelining' from source: unknown 41175 1727204679.00008: variable 'ansible_timeout' from source: unknown 41175 1727204679.00012: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204679.00099: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41175 1727204679.00109: variable 'omit' from source: magic vars 41175 1727204679.00116: starting attempt loop 41175 1727204679.00121: running the handler 41175 1727204679.00188: variable 'ansible_facts' from source: unknown 41175 1727204679.00840: _low_level_execute_command(): starting 41175 1727204679.00846: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41175 1727204679.01394: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204679.01398: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204679.01401: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204679.01404: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 41175 1727204679.01406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204679.01464: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204679.01472: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204679.01511: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204679.03267: stdout chunk (state=3): >>>/root <<< 41175 1727204679.03377: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204679.03448: stderr chunk (state=3): >>><<< 41175 1727204679.03451: stdout chunk (state=3): >>><<< 41175 1727204679.03464: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204679.03478: _low_level_execute_command(): starting 41175 1727204679.03526: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204679.0346973-43108-97758024637440 `" && echo ansible-tmp-1727204679.0346973-43108-97758024637440="` echo /root/.ansible/tmp/ansible-tmp-1727204679.0346973-43108-97758024637440 `" ) && sleep 0' 41175 1727204679.03960: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 1727204679.03963: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204679.03967: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204679.03970: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204679.03972: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204679.04023: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204679.04027: stderr chunk (state=3): >>>debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204679.04075: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204679.06350: stdout chunk (state=3): >>>ansible-tmp-1727204679.0346973-43108-97758024637440=/root/.ansible/tmp/ansible-tmp-1727204679.0346973-43108-97758024637440 <<< 41175 1727204679.06602: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204679.06606: stdout chunk (state=3): >>><<< 41175 1727204679.06609: stderr chunk (state=3): >>><<< 41175 1727204679.06612: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204679.0346973-43108-97758024637440=/root/.ansible/tmp/ansible-tmp-1727204679.0346973-43108-97758024637440 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204679.06614: variable 'ansible_module_compression' from source: unknown 41175 
1727204679.06668: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-41175qa2aqh90/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 41175 1727204679.06820: variable 'ansible_facts' from source: unknown 41175 1727204679.07320: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204679.0346973-43108-97758024637440/AnsiballZ_systemd.py 41175 1727204679.07802: Sending initial data 41175 1727204679.07805: Sent initial data (155 bytes) 41175 1727204679.08854: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204679.08857: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204679.08860: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204679.08862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204679.09208: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204679.09271: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204679.10898: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 41175 1727204679.10913: stderr 
chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 41175 1727204679.10929: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 41175 1727204679.10946: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41175 1727204679.11003: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 41175 1727204679.11060: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41175qa2aqh90/tmpkizqpegu /root/.ansible/tmp/ansible-tmp-1727204679.0346973-43108-97758024637440/AnsiballZ_systemd.py <<< 41175 1727204679.11092: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204679.0346973-43108-97758024637440/AnsiballZ_systemd.py" debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-41175qa2aqh90/tmpkizqpegu" to remote "/root/.ansible/tmp/ansible-tmp-1727204679.0346973-43108-97758024637440/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204679.0346973-43108-97758024637440/AnsiballZ_systemd.py" <<< 41175 1727204679.13495: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204679.13511: stderr chunk (state=3): >>><<< 41175 1727204679.13536: stdout chunk (state=3): >>><<< 41175 1727204679.13571: done transferring module to remote 41175 
1727204679.13592: _low_level_execute_command(): starting 41175 1727204679.13607: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204679.0346973-43108-97758024637440/ /root/.ansible/tmp/ansible-tmp-1727204679.0346973-43108-97758024637440/AnsiballZ_systemd.py && sleep 0' 41175 1727204679.14272: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 1727204679.14288: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204679.14309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204679.14329: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204679.14363: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204679.14467: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204679.14488: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204679.14560: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204679.16464: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204679.16480: stdout chunk (state=3): 
>>><<< 41175 1727204679.16506: stderr chunk (state=3): >>><<< 41175 1727204679.16524: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204679.16617: _low_level_execute_command(): starting 41175 1727204679.16623: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204679.0346973-43108-97758024637440/AnsiballZ_systemd.py && sleep 0' 41175 1727204679.17281: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 41175 1727204679.17309: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204679.17324: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204679.17404: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204679.50150: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "647", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:45:24 EDT", 
"ExecMainStartTimestampMonotonic": "28911103", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "647", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3378", "MemoryCurrent": "11870208", "MemoryAvailable": "infinity", "CPUUsageNSec": "1961996000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", 
"DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "in<<< 41175 1727204679.50193: stdout chunk (state=3): >>>finity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", 
"NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target cloud-init.service shutdown.target NetworkManager-wait-online.service network.service network.target", "After": "network-pre.target basic.target dbus-broker.service cloud-init-local.service systemd-journald.socket sysinit.target dbus.socket system.slice", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "l<<< 41175 1727204679.50227: stdout chunk (state=3): >>>oaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 15:02:42 EDT", "StateChangeTimestampMonotonic": "1066831351", "InactiveExitTimestamp": "Tue 2024-09-24 14:45:24 EDT", "InactiveExitTimestampMonotonic": "28911342", "ActiveEnterTimestamp": "Tue 2024-09-24 14:45:25 EDT", 
"ActiveEnterTimestampMonotonic": "29816317", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:45:24 EDT", "ConditionTimestampMonotonic": "28901880", "AssertTimestamp": "Tue 2024-09-24 14:45:24 EDT", "AssertTimestampMonotonic": "28901883", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "b6a383b318af414f897f5e2227729b18", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 41175 1727204679.52303: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
<<< 41175 1727204679.52307: stdout chunk (state=3): >>><<< 41175 1727204679.52310: stderr chunk (state=3): >>><<< 41175 1727204679.52333: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "647", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:45:24 EDT", "ExecMainStartTimestampMonotonic": "28911103", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "647", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; 
stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3378", "MemoryCurrent": "11870208", "MemoryAvailable": "infinity", "CPUUsageNSec": "1961996000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", 
"MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", 
"PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target 
cloud-init.service shutdown.target NetworkManager-wait-online.service network.service network.target", "After": "network-pre.target basic.target dbus-broker.service cloud-init-local.service systemd-journald.socket sysinit.target dbus.socket system.slice", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 15:02:42 EDT", "StateChangeTimestampMonotonic": "1066831351", "InactiveExitTimestamp": "Tue 2024-09-24 14:45:24 EDT", "InactiveExitTimestampMonotonic": "28911342", "ActiveEnterTimestamp": "Tue 2024-09-24 14:45:25 EDT", "ActiveEnterTimestampMonotonic": "29816317", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:45:24 EDT", "ConditionTimestampMonotonic": "28901880", "AssertTimestamp": "Tue 2024-09-24 14:45:24 EDT", "AssertTimestampMonotonic": "28901883", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "b6a383b318af414f897f5e2227729b18", "CollectMode": 
"inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
41175 1727204679.52783: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204679.0346973-43108-97758024637440/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41175 1727204679.52787: _low_level_execute_command(): starting 41175 1727204679.52793: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204679.0346973-43108-97758024637440/ > /dev/null 2>&1 && sleep 0' 41175 1727204679.53403: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 1727204679.53424: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204679.53456: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204679.53560: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204679.53593: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204679.53613: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204679.53640: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204679.53719: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204679.55802: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204679.55807: stdout chunk (state=3): >>><<< 41175 1727204679.55810: stderr chunk (state=3): >>><<< 41175 1727204679.55995: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 
2 debug2: Received exit status from master 0 41175 1727204679.55999: handler run complete 41175 1727204679.56002: attempt loop complete, returning result 41175 1727204679.56005: _execute() done 41175 1727204679.56007: dumping result to json 41175 1727204679.56009: done dumping result, returning 41175 1727204679.56012: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12b410aa-8751-f070-39c4-0000000000e3] 41175 1727204679.56014: sending task result for task 12b410aa-8751-f070-39c4-0000000000e3 ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 41175 1727204679.56624: no more pending results, returning what we have 41175 1727204679.56629: results queue empty 41175 1727204679.56630: checking for any_errors_fatal 41175 1727204679.56640: done checking for any_errors_fatal 41175 1727204679.56641: checking for max_fail_percentage 41175 1727204679.56643: done checking for max_fail_percentage 41175 1727204679.56644: checking to see if all hosts have failed and the running result is not ok 41175 1727204679.56645: done checking to see if all hosts have failed 41175 1727204679.56646: getting the remaining hosts for this loop 41175 1727204679.56648: done getting the remaining hosts for this loop 41175 1727204679.56654: getting the next task for host managed-node3 41175 1727204679.56661: done getting next task for host managed-node3 41175 1727204679.56666: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 41175 1727204679.56669: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204679.56681: getting variables 41175 1727204679.56683: in VariableManager get_vars() 41175 1727204679.56744: Calling all_inventory to load vars for managed-node3 41175 1727204679.56748: Calling groups_inventory to load vars for managed-node3 41175 1727204679.56751: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204679.56764: Calling all_plugins_play to load vars for managed-node3 41175 1727204679.56768: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204679.56772: Calling groups_plugins_play to load vars for managed-node3 41175 1727204679.57295: done sending task result for task 12b410aa-8751-f070-39c4-0000000000e3 41175 1727204679.57303: WORKER PROCESS EXITING 41175 1727204679.58764: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204679.60974: done with get_vars() 41175 1727204679.61012: done getting variables 41175 1727204679.61085: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 15:04:39 -0400 (0:00:00.700) 0:00:46.750 ***** 41175 1727204679.61126: entering _queue_task() for managed-node3/service 41175 1727204679.61510: worker is 1 (out of 1 available) 41175 1727204679.61528: exiting _queue_task() for managed-node3/service 41175 1727204679.61542: done queuing things up, now waiting for results queue to drain 41175 1727204679.61544: waiting for pending results... 
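The trace that follows evaluates `network_provider == "nm"` as True and then `__network_wpa_supplicant_required` as False (the role derives that flag from its defaults, e.g. whether any IEEE 802.1X or wireless connections are defined), so the task is skipped. A hedged sketch of that gating, with the exact conditional expressions assumed rather than taken from the role source:

```yaml
# Hedged sketch of the conditional gating seen in the following trace;
# the role computes __network_wpa_supplicant_required internally (e.g. from
# __network_ieee802_1x_connections_defined and wireless connection profiles).
- name: Enable and start wpa_supplicant
  ansible.builtin.systemd:
    name: wpa_supplicant
    state: started
    enabled: true
  when:
    - network_provider == "nm"
    - __network_wpa_supplicant_required | bool
```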
41175 1727204679.61864: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 41175 1727204679.62012: in run() - task 12b410aa-8751-f070-39c4-0000000000e4 41175 1727204679.62033: variable 'ansible_search_path' from source: unknown 41175 1727204679.62037: variable 'ansible_search_path' from source: unknown 41175 1727204679.62070: calling self._execute() 41175 1727204679.62162: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204679.62169: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204679.62180: variable 'omit' from source: magic vars 41175 1727204679.62524: variable 'ansible_distribution_major_version' from source: facts 41175 1727204679.62536: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204679.62639: variable 'network_provider' from source: set_fact 41175 1727204679.62646: Evaluated conditional (network_provider == "nm"): True 41175 1727204679.62726: variable '__network_wpa_supplicant_required' from source: role '' defaults 41175 1727204679.62805: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 41175 1727204679.62962: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41175 1727204679.65040: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41175 1727204679.65081: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41175 1727204679.65115: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41175 1727204679.65152: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41175 1727204679.65175: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41175 1727204679.65264: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204679.65288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204679.65311: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204679.65351: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204679.65364: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204679.65406: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204679.65429: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204679.65451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204679.65486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204679.65501: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204679.65538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204679.65563: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204679.65586: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204679.65619: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204679.65633: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204679.65754: variable 'network_connections' from source: play vars 41175 1727204679.65764: variable 'profile' from source: play vars 41175 1727204679.65830: variable 'profile' from source: play vars 41175 1727204679.65836: variable 'interface' from source: set_fact 41175 1727204679.65886: variable 'interface' from source: set_fact 41175 1727204679.65949: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41175 1727204679.66081: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41175 1727204679.66118: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41175 1727204679.66147: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41175 1727204679.66172: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41175 1727204679.66210: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41175 1727204679.66237: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41175 1727204679.66256: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204679.66277: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41175 1727204679.66328: variable '__network_wireless_connections_defined' from source: role '' defaults 41175 1727204679.66547: variable 'network_connections' from source: play vars 41175 1727204679.66552: variable 'profile' from source: play vars 41175 1727204679.66604: variable 'profile' from source: play vars 41175 1727204679.66608: variable 'interface' from source: set_fact 41175 1727204679.66662: variable 'interface' from source: set_fact 41175 1727204679.66688: Evaluated conditional (__network_wpa_supplicant_required): False 41175 1727204679.66693: when evaluation is False, skipping this task 41175 1727204679.66696: _execute() done 41175 1727204679.66706: dumping result 
to json 41175 1727204679.66709: done dumping result, returning 41175 1727204679.66712: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12b410aa-8751-f070-39c4-0000000000e4] 41175 1727204679.66718: sending task result for task 12b410aa-8751-f070-39c4-0000000000e4 41175 1727204679.66812: done sending task result for task 12b410aa-8751-f070-39c4-0000000000e4 41175 1727204679.66815: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 41175 1727204679.66867: no more pending results, returning what we have 41175 1727204679.66871: results queue empty 41175 1727204679.66872: checking for any_errors_fatal 41175 1727204679.66897: done checking for any_errors_fatal 41175 1727204679.66898: checking for max_fail_percentage 41175 1727204679.66900: done checking for max_fail_percentage 41175 1727204679.66901: checking to see if all hosts have failed and the running result is not ok 41175 1727204679.66902: done checking to see if all hosts have failed 41175 1727204679.66903: getting the remaining hosts for this loop 41175 1727204679.66905: done getting the remaining hosts for this loop 41175 1727204679.66910: getting the next task for host managed-node3 41175 1727204679.66917: done getting next task for host managed-node3 41175 1727204679.66921: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 41175 1727204679.66923: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204679.66939: getting variables 41175 1727204679.66941: in VariableManager get_vars() 41175 1727204679.66980: Calling all_inventory to load vars for managed-node3 41175 1727204679.66983: Calling groups_inventory to load vars for managed-node3 41175 1727204679.66986: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204679.67004: Calling all_plugins_play to load vars for managed-node3 41175 1727204679.67007: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204679.67011: Calling groups_plugins_play to load vars for managed-node3 41175 1727204679.68398: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204679.70017: done with get_vars() 41175 1727204679.70043: done getting variables 41175 1727204679.70092: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 15:04:39 -0400 (0:00:00.089) 0:00:46.840 ***** 41175 1727204679.70115: entering _queue_task() for managed-node3/service 41175 1727204679.70362: worker is 1 (out of 1 available) 41175 1727204679.70378: exiting _queue_task() for managed-node3/service 41175 1727204679.70392: done queuing things up, now waiting for results queue to drain 41175 1727204679.70395: waiting for pending results... 
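
The skip recorded above ("Enable and start wpa_supplicant") is driven by a `when:` conditional on `__network_wpa_supplicant_required`, which evaluated False. As a sketch only (task name and variable are taken from the log; the actual role source is not shown here), the pattern being evaluated looks like:

```yaml
# Sketch reconstructed from the log output, not the actual role task file.
- name: Enable and start wpa_supplicant
  ansible.builtin.service:
    name: wpa_supplicant
    state: started
    enabled: true
  # Evaluated False in this run, so Ansible reports
  # skip_reason: "Conditional result was False"
  when: __network_wpa_supplicant_required
```
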
41175 1727204679.70584: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service 41175 1727204679.70664: in run() - task 12b410aa-8751-f070-39c4-0000000000e5 41175 1727204679.70678: variable 'ansible_search_path' from source: unknown 41175 1727204679.70682: variable 'ansible_search_path' from source: unknown 41175 1727204679.70715: calling self._execute() 41175 1727204679.70805: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204679.70813: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204679.70825: variable 'omit' from source: magic vars 41175 1727204679.71159: variable 'ansible_distribution_major_version' from source: facts 41175 1727204679.71178: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204679.71273: variable 'network_provider' from source: set_fact 41175 1727204679.71277: Evaluated conditional (network_provider == "initscripts"): False 41175 1727204679.71281: when evaluation is False, skipping this task 41175 1727204679.71286: _execute() done 41175 1727204679.71289: dumping result to json 41175 1727204679.71302: done dumping result, returning 41175 1727204679.71306: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service [12b410aa-8751-f070-39c4-0000000000e5] 41175 1727204679.71311: sending task result for task 12b410aa-8751-f070-39c4-0000000000e5 41175 1727204679.71407: done sending task result for task 12b410aa-8751-f070-39c4-0000000000e5 41175 1727204679.71410: WORKER PROCESS EXITING skipping: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 41175 1727204679.71457: no more pending results, returning what we have 41175 1727204679.71463: results queue empty 41175 1727204679.71464: checking for any_errors_fatal 41175 1727204679.71477: done checking for 
any_errors_fatal 41175 1727204679.71478: checking for max_fail_percentage 41175 1727204679.71480: done checking for max_fail_percentage 41175 1727204679.71481: checking to see if all hosts have failed and the running result is not ok 41175 1727204679.71482: done checking to see if all hosts have failed 41175 1727204679.71483: getting the remaining hosts for this loop 41175 1727204679.71485: done getting the remaining hosts for this loop 41175 1727204679.71491: getting the next task for host managed-node3 41175 1727204679.71497: done getting next task for host managed-node3 41175 1727204679.71501: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 41175 1727204679.71504: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204679.71519: getting variables 41175 1727204679.71520: in VariableManager get_vars() 41175 1727204679.71556: Calling all_inventory to load vars for managed-node3 41175 1727204679.71559: Calling groups_inventory to load vars for managed-node3 41175 1727204679.71562: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204679.71572: Calling all_plugins_play to load vars for managed-node3 41175 1727204679.71575: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204679.71578: Calling groups_plugins_play to load vars for managed-node3 41175 1727204679.72830: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204679.74457: done with get_vars() 41175 1727204679.74480: done getting variables 41175 1727204679.74536: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 15:04:39 -0400 (0:00:00.044) 0:00:46.884 ***** 41175 1727204679.74562: entering _queue_task() for managed-node3/copy 41175 1727204679.74824: worker is 1 (out of 1 available) 41175 1727204679.74840: exiting _queue_task() for managed-node3/copy 41175 1727204679.74852: done queuing things up, now waiting for results queue to drain 41175 1727204679.74854: waiting for pending results... 
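
The "Enable network service" result above is reported as censored because `no_log: true` was set on the task; the task itself was skipped because `network_provider == "initscripts"` evaluated False. A minimal sketch of a task combining both behaviors (module and parameters are assumptions; only the condition and the censoring are confirmed by the log):

```yaml
# Sketch only: illustrates the no_log censoring seen in the result above.
- name: Enable network service
  ansible.builtin.service:
    name: network
    enabled: true
  # no_log replaces the result with "the output has been hidden ..."
  no_log: true
  # Evaluated False here (network_provider was "nm"), so the task is skipped.
  when: network_provider == "initscripts"
```
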
41175 1727204679.75056: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 41175 1727204679.75150: in run() - task 12b410aa-8751-f070-39c4-0000000000e6 41175 1727204679.75164: variable 'ansible_search_path' from source: unknown 41175 1727204679.75168: variable 'ansible_search_path' from source: unknown 41175 1727204679.75205: calling self._execute() 41175 1727204679.75288: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204679.75298: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204679.75310: variable 'omit' from source: magic vars 41175 1727204679.75639: variable 'ansible_distribution_major_version' from source: facts 41175 1727204679.75646: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204679.75748: variable 'network_provider' from source: set_fact 41175 1727204679.75752: Evaluated conditional (network_provider == "initscripts"): False 41175 1727204679.75756: when evaluation is False, skipping this task 41175 1727204679.75759: _execute() done 41175 1727204679.75761: dumping result to json 41175 1727204679.75766: done dumping result, returning 41175 1727204679.75776: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12b410aa-8751-f070-39c4-0000000000e6] 41175 1727204679.75782: sending task result for task 12b410aa-8751-f070-39c4-0000000000e6 41175 1727204679.75882: done sending task result for task 12b410aa-8751-f070-39c4-0000000000e6 41175 1727204679.75885: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 41175 1727204679.75938: no more pending results, returning what we have 41175 1727204679.75943: results queue empty 41175 1727204679.75944: checking for 
any_errors_fatal 41175 1727204679.75954: done checking for any_errors_fatal 41175 1727204679.75955: checking for max_fail_percentage 41175 1727204679.75957: done checking for max_fail_percentage 41175 1727204679.75958: checking to see if all hosts have failed and the running result is not ok 41175 1727204679.75959: done checking to see if all hosts have failed 41175 1727204679.75960: getting the remaining hosts for this loop 41175 1727204679.75962: done getting the remaining hosts for this loop 41175 1727204679.75966: getting the next task for host managed-node3 41175 1727204679.75972: done getting next task for host managed-node3 41175 1727204679.75976: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 41175 1727204679.75978: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204679.75995: getting variables 41175 1727204679.75997: in VariableManager get_vars() 41175 1727204679.76031: Calling all_inventory to load vars for managed-node3 41175 1727204679.76034: Calling groups_inventory to load vars for managed-node3 41175 1727204679.76037: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204679.76047: Calling all_plugins_play to load vars for managed-node3 41175 1727204679.76050: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204679.76053: Calling groups_plugins_play to load vars for managed-node3 41175 1727204679.77416: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204679.79018: done with get_vars() 41175 1727204679.79044: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 15:04:39 -0400 (0:00:00.045) 0:00:46.930 ***** 41175 1727204679.79111: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 41175 1727204679.79361: worker is 1 (out of 1 available) 41175 1727204679.79375: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 41175 1727204679.79391: done queuing things up, now waiting for results queue to drain 41175 1727204679.79393: waiting for pending results... 
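
The next task dispatches the collection's `network_connections` action for managed-node3. The play vars referenced in the log (`network_connections`, `profile`, `interface` from `set_fact`), together with the `module_args` dumped later in the run, suggest vars roughly like the following (hypothetical reconstruction; the real play and inventory data are not shown):

```yaml
# Hypothetical play vars matching the variable names in the log.
vars:
  interface: ethtest0            # set via set_fact earlier in the play
  profile: "{{ interface }}"
  network_connections:
    - name: "{{ profile }}"
      persistent_state: absent   # matches the module_args seen in the result
```
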
41175 1727204679.79587: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 41175 1727204679.79668: in run() - task 12b410aa-8751-f070-39c4-0000000000e7 41175 1727204679.79683: variable 'ansible_search_path' from source: unknown 41175 1727204679.79687: variable 'ansible_search_path' from source: unknown 41175 1727204679.79725: calling self._execute() 41175 1727204679.79813: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204679.79821: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204679.79831: variable 'omit' from source: magic vars 41175 1727204679.80167: variable 'ansible_distribution_major_version' from source: facts 41175 1727204679.80182: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204679.80186: variable 'omit' from source: magic vars 41175 1727204679.80222: variable 'omit' from source: magic vars 41175 1727204679.80362: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41175 1727204679.82068: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41175 1727204679.82125: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41175 1727204679.82160: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41175 1727204679.82192: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41175 1727204679.82214: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41175 1727204679.82286: variable 'network_provider' from source: set_fact 41175 1727204679.82403: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41175 1727204679.82440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41175 1727204679.82461: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41175 1727204679.82500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41175 1727204679.82513: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41175 1727204679.82576: variable 'omit' from source: magic vars 41175 1727204679.82672: variable 'omit' from source: magic vars 41175 1727204679.82782: variable 'network_connections' from source: play vars 41175 1727204679.82795: variable 'profile' from source: play vars 41175 1727204679.82856: variable 'profile' from source: play vars 41175 1727204679.82860: variable 'interface' from source: set_fact 41175 1727204679.82916: variable 'interface' from source: set_fact 41175 1727204679.83035: variable 'omit' from source: magic vars 41175 1727204679.83044: variable '__lsr_ansible_managed' from source: task vars 41175 1727204679.83094: variable '__lsr_ansible_managed' from source: task vars 41175 1727204679.83328: Loaded config def from plugin (lookup/template) 41175 1727204679.83332: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 41175 1727204679.83362: File lookup term: get_ansible_managed.j2 41175 
1727204679.83366: variable 'ansible_search_path' from source: unknown 41175 1727204679.83369: evaluation_path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 41175 1727204679.83384: search_path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 41175 1727204679.83399: variable 'ansible_search_path' from source: unknown 41175 1727204679.91297: variable 'ansible_managed' from source: unknown 41175 1727204679.91456: variable 'omit' from source: magic vars 41175 1727204679.91498: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41175 1727204679.91534: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41175 1727204679.91560: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41175 1727204679.91597: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204679.91614: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204679.91649: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41175 1727204679.91659: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204679.91667: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204679.91792: Set connection var ansible_shell_executable to /bin/sh 41175 1727204679.91803: Set connection var ansible_shell_type to sh 41175 1727204679.91816: Set connection var ansible_pipelining to False 41175 1727204679.91831: Set connection var ansible_timeout to 10 41175 1727204679.91842: Set connection var ansible_connection to ssh 41175 1727204679.91852: Set connection var ansible_module_compression to ZIP_DEFLATED 41175 1727204679.91884: variable 'ansible_shell_executable' from source: unknown 41175 1727204679.92094: variable 'ansible_connection' from source: unknown 41175 1727204679.92097: variable 'ansible_module_compression' from source: unknown 41175 1727204679.92100: variable 'ansible_shell_type' from source: unknown 41175 1727204679.92102: variable 'ansible_shell_executable' from source: unknown 41175 1727204679.92104: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204679.92107: variable 'ansible_pipelining' from source: unknown 41175 1727204679.92109: variable 'ansible_timeout' from source: unknown 41175 1727204679.92111: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204679.92113: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 41175 1727204679.92124: variable 'omit' from source: magic vars 41175 1727204679.92131: starting attempt loop 41175 1727204679.92139: running the handler 41175 1727204679.92159: _low_level_execute_command(): starting 41175 1727204679.92171: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41175 1727204679.92892: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 1727204679.92902: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204679.92915: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204679.92932: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204679.92946: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 41175 1727204679.92954: stderr chunk (state=3): >>>debug2: match not found <<< 41175 1727204679.92964: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204679.93060: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204679.93087: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 
1727204679.93162: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204679.94905: stdout chunk (state=3): >>>/root <<< 41175 1727204679.95085: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204679.95100: stdout chunk (state=3): >>><<< 41175 1727204679.95113: stderr chunk (state=3): >>><<< 41175 1727204679.95142: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204679.95165: _low_level_execute_command(): starting 41175 1727204679.95176: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204679.9515269-43138-8818414677884 `" && echo ansible-tmp-1727204679.9515269-43138-8818414677884="` echo 
/root/.ansible/tmp/ansible-tmp-1727204679.9515269-43138-8818414677884 `" ) && sleep 0' 41175 1727204679.95813: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 1727204679.95826: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204679.95849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204679.95868: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204679.95884: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 41175 1727204679.95899: stderr chunk (state=3): >>>debug2: match not found <<< 41175 1727204679.95913: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204679.95931: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41175 1727204679.96007: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address <<< 41175 1727204679.96011: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204679.96065: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204679.96086: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204679.96151: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204679.98203: stdout chunk (state=3): 
>>>ansible-tmp-1727204679.9515269-43138-8818414677884=/root/.ansible/tmp/ansible-tmp-1727204679.9515269-43138-8818414677884 <<< 41175 1727204679.98398: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204679.98418: stdout chunk (state=3): >>><<< 41175 1727204679.98433: stderr chunk (state=3): >>><<< 41175 1727204679.98456: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204679.9515269-43138-8818414677884=/root/.ansible/tmp/ansible-tmp-1727204679.9515269-43138-8818414677884 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204679.98606: variable 'ansible_module_compression' from source: unknown 41175 1727204679.98609: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-41175qa2aqh90/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 41175 1727204679.98626: 
variable 'ansible_facts' from source: unknown 41175 1727204679.98759: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204679.9515269-43138-8818414677884/AnsiballZ_network_connections.py 41175 1727204679.98973: Sending initial data 41175 1727204679.98976: Sent initial data (166 bytes) 41175 1727204679.99747: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 1727204679.99795: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204679.99845: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204679.99987: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204680.00020: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204680.00094: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204680.01765: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" 
revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41175 1727204680.01799: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 41175 1727204680.01838: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41175qa2aqh90/tmpnkykdq5s /root/.ansible/tmp/ansible-tmp-1727204679.9515269-43138-8818414677884/AnsiballZ_network_connections.py <<< 41175 1727204680.01841: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204679.9515269-43138-8818414677884/AnsiballZ_network_connections.py" <<< 41175 1727204680.01880: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-41175qa2aqh90/tmpnkykdq5s" to remote "/root/.ansible/tmp/ansible-tmp-1727204679.9515269-43138-8818414677884/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204679.9515269-43138-8818414677884/AnsiballZ_network_connections.py" <<< 41175 1727204680.03344: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204680.03453: stderr chunk (state=3): >>><<< 41175 1727204680.03456: stdout chunk (state=3): >>><<< 41175 1727204680.03459: done transferring module to remote 41175 1727204680.03461: _low_level_execute_command(): starting 41175 1727204680.03464: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204679.9515269-43138-8818414677884/ 
/root/.ansible/tmp/ansible-tmp-1727204679.9515269-43138-8818414677884/AnsiballZ_network_connections.py && sleep 0' 41175 1727204680.03976: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204680.03992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204680.04019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204680.04064: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204680.04068: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204680.04108: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204680.06229: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204680.06233: stdout chunk (state=3): >>><<< 41175 1727204680.06236: stderr chunk (state=3): >>><<< 41175 1727204680.06238: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204680.06241: _low_level_execute_command(): starting 41175 1727204680.06243: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204679.9515269-43138-8818414677884/AnsiballZ_network_connections.py && sleep 0' 41175 1727204680.06814: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204680.06821: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204680.06900: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204680.06904: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204680.06919: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204680.06980: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204680.37147: stdout chunk (state=3): >>>Traceback (most recent call last):<<< 41175 1727204680.37187: stdout chunk (state=3): >>> File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_h677ua_i/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_h677ua_i/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on ethtest0/270b38fe-b70b-4444-a0b4-74394ad4b2b5: error=unknown <<< 41175 1727204680.37323: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", 
"connections": [{"name": "ethtest0", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 41175 1727204680.39401: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204680.39436: stderr chunk (state=3): >>>Shared connection to 10.31.10.90 closed. <<< 41175 1727204680.39440: stdout chunk (state=3): >>><<< 41175 1727204680.39442: stderr chunk (state=3): >>><<< 41175 1727204680.39461: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_h677ua_i/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_h677ua_i/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on ethtest0/270b38fe-b70b-4444-a0b4-74394ad4b2b5: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
41175 1727204680.39599: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'ethtest0', 'persistent_state': 'absent'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204679.9515269-43138-8818414677884/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41175 1727204680.39602: _low_level_execute_command(): starting 41175 1727204680.39605: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204679.9515269-43138-8818414677884/ > /dev/null 2>&1 && sleep 0' 41175 1727204680.40213: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 1727204680.40232: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204680.40257: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204680.40370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204680.40401: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204680.40460: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204680.42405: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204680.42504: stderr chunk (state=3): >>><<< 41175 1727204680.42507: stdout chunk (state=3): >>><<< 41175 1727204680.42513: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: 
master session id: 2 debug2: Received exit status from master 0 41175 1727204680.42523: handler run complete 41175 1727204680.42560: attempt loop complete, returning result 41175 1727204680.42563: _execute() done 41175 1727204680.42566: dumping result to json 41175 1727204680.42574: done dumping result, returning 41175 1727204680.42694: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12b410aa-8751-f070-39c4-0000000000e7] 41175 1727204680.42698: sending task result for task 12b410aa-8751-f070-39c4-0000000000e7 41175 1727204680.42775: done sending task result for task 12b410aa-8751-f070-39c4-0000000000e7 41175 1727204680.42778: WORKER PROCESS EXITING changed: [managed-node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "ethtest0", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 41175 1727204680.43028: no more pending results, returning what we have 41175 1727204680.43033: results queue empty 41175 1727204680.43034: checking for any_errors_fatal 41175 1727204680.43043: done checking for any_errors_fatal 41175 1727204680.43044: checking for max_fail_percentage 41175 1727204680.43046: done checking for max_fail_percentage 41175 1727204680.43047: checking to see if all hosts have failed and the running result is not ok 41175 1727204680.43049: done checking to see if all hosts have failed 41175 1727204680.43049: getting the remaining hosts for this loop 41175 1727204680.43051: done getting the remaining hosts for this loop 41175 1727204680.43056: getting the next task for host managed-node3 41175 1727204680.43062: done getting next task for host managed-node3 41175 1727204680.43066: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 41175 1727204680.43068: ^ state is: HOST 
STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41175 1727204680.43080: getting variables 41175 1727204680.43082: in VariableManager get_vars() 41175 1727204680.43237: Calling all_inventory to load vars for managed-node3 41175 1727204680.43240: Calling groups_inventory to load vars for managed-node3 41175 1727204680.43244: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204680.43256: Calling all_plugins_play to load vars for managed-node3 41175 1727204680.43260: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204680.43264: Calling groups_plugins_play to load vars for managed-node3 41175 1727204680.44899: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204680.47078: done with get_vars() 41175 1727204680.47108: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 15:04:40 -0400 (0:00:00.680) 0:00:47.610 ***** 41175 1727204680.47175: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_state 41175 1727204680.47426: worker is 1 (out of 1 available) 41175 1727204680.47441: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_state 41175 1727204680.47454: done queuing things up, now waiting for results queue to drain 41175 1727204680.47456: waiting for pending results... 
41175 1727204680.47655: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state 41175 1727204680.47742: in run() - task 12b410aa-8751-f070-39c4-0000000000e8 41175 1727204680.47757: variable 'ansible_search_path' from source: unknown 41175 1727204680.47760: variable 'ansible_search_path' from source: unknown 41175 1727204680.47798: calling self._execute() 41175 1727204680.47883: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204680.47892: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204680.47904: variable 'omit' from source: magic vars 41175 1727204680.48233: variable 'ansible_distribution_major_version' from source: facts 41175 1727204680.48246: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204680.48354: variable 'network_state' from source: role '' defaults 41175 1727204680.48368: Evaluated conditional (network_state != {}): False 41175 1727204680.48372: when evaluation is False, skipping this task 41175 1727204680.48374: _execute() done 41175 1727204680.48377: dumping result to json 41175 1727204680.48380: done dumping result, returning 41175 1727204680.48392: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state [12b410aa-8751-f070-39c4-0000000000e8] 41175 1727204680.48398: sending task result for task 12b410aa-8751-f070-39c4-0000000000e8 41175 1727204680.48494: done sending task result for task 12b410aa-8751-f070-39c4-0000000000e8 41175 1727204680.48498: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 41175 1727204680.48553: no more pending results, returning what we have 41175 1727204680.48558: results queue empty 41175 1727204680.48559: checking for any_errors_fatal 41175 1727204680.48571: done checking for any_errors_fatal 
41175 1727204680.48572: checking for max_fail_percentage 41175 1727204680.48574: done checking for max_fail_percentage 41175 1727204680.48575: checking to see if all hosts have failed and the running result is not ok 41175 1727204680.48576: done checking to see if all hosts have failed 41175 1727204680.48577: getting the remaining hosts for this loop 41175 1727204680.48579: done getting the remaining hosts for this loop 41175 1727204680.48583: getting the next task for host managed-node3 41175 1727204680.48592: done getting next task for host managed-node3 41175 1727204680.48596: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 41175 1727204680.48599: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204680.48613: getting variables 41175 1727204680.48615: in VariableManager get_vars() 41175 1727204680.48650: Calling all_inventory to load vars for managed-node3 41175 1727204680.48652: Calling groups_inventory to load vars for managed-node3 41175 1727204680.48655: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204680.48665: Calling all_plugins_play to load vars for managed-node3 41175 1727204680.48668: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204680.48671: Calling groups_plugins_play to load vars for managed-node3 41175 1727204680.50585: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204680.52383: done with get_vars() 41175 1727204680.52419: done getting variables 41175 1727204680.52505: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 15:04:40 -0400 (0:00:00.053) 0:00:47.664 ***** 41175 1727204680.52540: entering _queue_task() for managed-node3/debug 41175 1727204680.52868: worker is 1 (out of 1 available) 41175 1727204680.52883: exiting _queue_task() for managed-node3/debug 41175 1727204680.52900: done queuing things up, now waiting for results queue to drain 41175 1727204680.52902: waiting for pending results... 
41175 1727204680.53213: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 41175 1727204680.53494: in run() - task 12b410aa-8751-f070-39c4-0000000000e9 41175 1727204680.53498: variable 'ansible_search_path' from source: unknown 41175 1727204680.53501: variable 'ansible_search_path' from source: unknown 41175 1727204680.53504: calling self._execute() 41175 1727204680.53507: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204680.53510: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204680.53565: variable 'omit' from source: magic vars 41175 1727204680.54065: variable 'ansible_distribution_major_version' from source: facts 41175 1727204680.54096: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204680.54127: variable 'omit' from source: magic vars 41175 1727204680.54228: variable 'omit' from source: magic vars 41175 1727204680.54268: variable 'omit' from source: magic vars 41175 1727204680.54312: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41175 1727204680.54344: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41175 1727204680.54364: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41175 1727204680.54385: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204680.54403: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204680.54433: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41175 1727204680.54437: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204680.54439: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node3' 41175 1727204680.54532: Set connection var ansible_shell_executable to /bin/sh 41175 1727204680.54536: Set connection var ansible_shell_type to sh 41175 1727204680.54539: Set connection var ansible_pipelining to False 41175 1727204680.54548: Set connection var ansible_timeout to 10 41175 1727204680.54554: Set connection var ansible_connection to ssh 41175 1727204680.54560: Set connection var ansible_module_compression to ZIP_DEFLATED 41175 1727204680.54580: variable 'ansible_shell_executable' from source: unknown 41175 1727204680.54583: variable 'ansible_connection' from source: unknown 41175 1727204680.54586: variable 'ansible_module_compression' from source: unknown 41175 1727204680.54592: variable 'ansible_shell_type' from source: unknown 41175 1727204680.54596: variable 'ansible_shell_executable' from source: unknown 41175 1727204680.54599: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204680.54604: variable 'ansible_pipelining' from source: unknown 41175 1727204680.54608: variable 'ansible_timeout' from source: unknown 41175 1727204680.54614: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204680.54736: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41175 1727204680.54751: variable 'omit' from source: magic vars 41175 1727204680.54758: starting attempt loop 41175 1727204680.54761: running the handler 41175 1727204680.54875: variable '__network_connections_result' from source: set_fact 41175 1727204680.54924: handler run complete 41175 1727204680.54941: attempt loop complete, returning result 41175 1727204680.54944: _execute() done 41175 1727204680.54948: dumping result to json 41175 1727204680.54951: 
done dumping result, returning 41175 1727204680.54966: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12b410aa-8751-f070-39c4-0000000000e9] 41175 1727204680.54969: sending task result for task 12b410aa-8751-f070-39c4-0000000000e9 41175 1727204680.55063: done sending task result for task 12b410aa-8751-f070-39c4-0000000000e9 41175 1727204680.55066: WORKER PROCESS EXITING ok: [managed-node3] => { "__network_connections_result.stderr_lines": [ "" ] } 41175 1727204680.55142: no more pending results, returning what we have 41175 1727204680.55146: results queue empty 41175 1727204680.55147: checking for any_errors_fatal 41175 1727204680.55156: done checking for any_errors_fatal 41175 1727204680.55157: checking for max_fail_percentage 41175 1727204680.55159: done checking for max_fail_percentage 41175 1727204680.55160: checking to see if all hosts have failed and the running result is not ok 41175 1727204680.55162: done checking to see if all hosts have failed 41175 1727204680.55163: getting the remaining hosts for this loop 41175 1727204680.55164: done getting the remaining hosts for this loop 41175 1727204680.55169: getting the next task for host managed-node3 41175 1727204680.55175: done getting next task for host managed-node3 41175 1727204680.55181: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 41175 1727204680.55183: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204680.55195: getting variables 41175 1727204680.55197: in VariableManager get_vars() 41175 1727204680.55241: Calling all_inventory to load vars for managed-node3 41175 1727204680.55245: Calling groups_inventory to load vars for managed-node3 41175 1727204680.55247: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204680.55257: Calling all_plugins_play to load vars for managed-node3 41175 1727204680.55261: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204680.55264: Calling groups_plugins_play to load vars for managed-node3 41175 1727204680.57242: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204680.58869: done with get_vars() 41175 1727204680.58896: done getting variables 41175 1727204680.58944: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 15:04:40 -0400 (0:00:00.064) 0:00:47.728 ***** 41175 1727204680.58969: entering _queue_task() for managed-node3/debug 41175 1727204680.59217: worker is 1 (out of 1 available) 41175 1727204680.59232: exiting _queue_task() for managed-node3/debug 41175 1727204680.59245: done queuing things up, now waiting for results queue to drain 41175 1727204680.59247: waiting for pending results... 
41175 1727204680.59444: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 41175 1727204680.59528: in run() - task 12b410aa-8751-f070-39c4-0000000000ea 41175 1727204680.59542: variable 'ansible_search_path' from source: unknown 41175 1727204680.59546: variable 'ansible_search_path' from source: unknown 41175 1727204680.59581: calling self._execute() 41175 1727204680.59667: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204680.59672: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204680.59683: variable 'omit' from source: magic vars 41175 1727204680.60017: variable 'ansible_distribution_major_version' from source: facts 41175 1727204680.60034: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204680.60043: variable 'omit' from source: magic vars 41175 1727204680.60072: variable 'omit' from source: magic vars 41175 1727204680.60106: variable 'omit' from source: magic vars 41175 1727204680.60146: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41175 1727204680.60177: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41175 1727204680.60197: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41175 1727204680.60213: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204680.60228: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204680.60259: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41175 1727204680.60263: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204680.60266: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node3' 41175 1727204680.60357: Set connection var ansible_shell_executable to /bin/sh 41175 1727204680.60360: Set connection var ansible_shell_type to sh 41175 1727204680.60364: Set connection var ansible_pipelining to False 41175 1727204680.60374: Set connection var ansible_timeout to 10 41175 1727204680.60381: Set connection var ansible_connection to ssh 41175 1727204680.60388: Set connection var ansible_module_compression to ZIP_DEFLATED 41175 1727204680.60410: variable 'ansible_shell_executable' from source: unknown 41175 1727204680.60413: variable 'ansible_connection' from source: unknown 41175 1727204680.60416: variable 'ansible_module_compression' from source: unknown 41175 1727204680.60423: variable 'ansible_shell_type' from source: unknown 41175 1727204680.60427: variable 'ansible_shell_executable' from source: unknown 41175 1727204680.60431: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204680.60436: variable 'ansible_pipelining' from source: unknown 41175 1727204680.60440: variable 'ansible_timeout' from source: unknown 41175 1727204680.60445: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204680.60567: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41175 1727204680.60577: variable 'omit' from source: magic vars 41175 1727204680.60584: starting attempt loop 41175 1727204680.60587: running the handler 41175 1727204680.60636: variable '__network_connections_result' from source: set_fact 41175 1727204680.60707: variable '__network_connections_result' from source: set_fact 41175 1727204680.60799: handler run complete 41175 1727204680.60826: attempt loop complete, returning result 41175 1727204680.60829: 
_execute() done 41175 1727204680.60832: dumping result to json 41175 1727204680.60837: done dumping result, returning 41175 1727204680.60846: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12b410aa-8751-f070-39c4-0000000000ea] 41175 1727204680.60852: sending task result for task 12b410aa-8751-f070-39c4-0000000000ea 41175 1727204680.60948: done sending task result for task 12b410aa-8751-f070-39c4-0000000000ea 41175 1727204680.60951: WORKER PROCESS EXITING ok: [managed-node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "ethtest0", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 41175 1727204680.61041: no more pending results, returning what we have 41175 1727204680.61045: results queue empty 41175 1727204680.61046: checking for any_errors_fatal 41175 1727204680.61053: done checking for any_errors_fatal 41175 1727204680.61054: checking for max_fail_percentage 41175 1727204680.61056: done checking for max_fail_percentage 41175 1727204680.61057: checking to see if all hosts have failed and the running result is not ok 41175 1727204680.61058: done checking to see if all hosts have failed 41175 1727204680.61059: getting the remaining hosts for this loop 41175 1727204680.61060: done getting the remaining hosts for this loop 41175 1727204680.61064: getting the next task for host managed-node3 41175 1727204680.61070: done getting next task for host managed-node3 41175 1727204680.61073: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 41175 1727204680.61075: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41175 1727204680.61085: getting variables 41175 1727204680.61087: in VariableManager get_vars() 41175 1727204680.61130: Calling all_inventory to load vars for managed-node3 41175 1727204680.61133: Calling groups_inventory to load vars for managed-node3 41175 1727204680.61136: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204680.61146: Calling all_plugins_play to load vars for managed-node3 41175 1727204680.61150: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204680.61154: Calling groups_plugins_play to load vars for managed-node3 41175 1727204680.62490: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204680.64096: done with get_vars() 41175 1727204680.64119: done getting variables 41175 1727204680.64166: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 15:04:40 -0400 (0:00:00.052) 0:00:47.780 ***** 41175 1727204680.64196: entering _queue_task() for managed-node3/debug 41175 1727204680.64423: worker is 1 (out of 1 available) 41175 1727204680.64439: exiting _queue_task() for managed-node3/debug 41175 1727204680.64451: done queuing things up, now waiting for results queue to drain 41175 1727204680.64453: waiting for pending results... 
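The records above queue the `Show debug messages for the network_state` task; before running the handler, the worker templates the task's `when:` expression and either proceeds or skips (the log prints lines like `Evaluated conditional (network_state != {}): False`). A minimal sketch of that gating decision — not Ansible's real implementation, which renders the expression through Jinja2 against the full variable set; `evaluate_when` is a hypothetical stand-in that only handles simple Python-compatible comparisons like this one:

```python
# Sketch (assumption): a stand-in for Ansible's `when:` evaluation.
# Ansible actually templates the expression with Jinja2; plain eval()
# happens to work for a simple comparison such as "network_state != {}".
def evaluate_when(condition: str, task_vars: dict) -> bool:
    return bool(eval(condition, {}, task_vars))

task_vars = {"network_state": {}}  # the role default is an empty dict
cond = "network_state != {}"
if evaluate_when(cond, task_vars):
    print("running the handler")
else:
    print("when evaluation is False, skipping this task")
```

With the role default (an empty `network_state`), the condition is false, which is why the task below is reported as `skipping:` rather than executed.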
41175 1727204680.64646: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 41175 1727204680.64733: in run() - task 12b410aa-8751-f070-39c4-0000000000eb 41175 1727204680.64747: variable 'ansible_search_path' from source: unknown 41175 1727204680.64751: variable 'ansible_search_path' from source: unknown 41175 1727204680.64784: calling self._execute() 41175 1727204680.64866: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204680.64871: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204680.64881: variable 'omit' from source: magic vars 41175 1727204680.65204: variable 'ansible_distribution_major_version' from source: facts 41175 1727204680.65215: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204680.65324: variable 'network_state' from source: role '' defaults 41175 1727204680.65334: Evaluated conditional (network_state != {}): False 41175 1727204680.65339: when evaluation is False, skipping this task 41175 1727204680.65342: _execute() done 41175 1727204680.65345: dumping result to json 41175 1727204680.65354: done dumping result, returning 41175 1727204680.65358: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12b410aa-8751-f070-39c4-0000000000eb] 41175 1727204680.65367: sending task result for task 12b410aa-8751-f070-39c4-0000000000eb 41175 1727204680.65460: done sending task result for task 12b410aa-8751-f070-39c4-0000000000eb 41175 1727204680.65463: WORKER PROCESS EXITING
skipping: [managed-node3] => {
    "false_condition": "network_state != {}"
}
41175 1727204680.65515: no more pending results, returning what we have 41175 1727204680.65520: results queue empty 41175 1727204680.65521: checking for any_errors_fatal 41175 1727204680.65528: done checking for any_errors_fatal 41175 1727204680.65529: checking for 
max_fail_percentage 41175 1727204680.65531: done checking for max_fail_percentage 41175 1727204680.65532: checking to see if all hosts have failed and the running result is not ok 41175 1727204680.65533: done checking to see if all hosts have failed 41175 1727204680.65534: getting the remaining hosts for this loop 41175 1727204680.65536: done getting the remaining hosts for this loop 41175 1727204680.65540: getting the next task for host managed-node3 41175 1727204680.65545: done getting next task for host managed-node3 41175 1727204680.65550: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 41175 1727204680.65552: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41175 1727204680.65567: getting variables 41175 1727204680.65568: in VariableManager get_vars() 41175 1727204680.65610: Calling all_inventory to load vars for managed-node3 41175 1727204680.65613: Calling groups_inventory to load vars for managed-node3 41175 1727204680.65616: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204680.65626: Calling all_plugins_play to load vars for managed-node3 41175 1727204680.65629: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204680.65633: Calling groups_plugins_play to load vars for managed-node3 41175 1727204680.66839: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204680.68461: done with get_vars() 41175 1727204680.68484: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 15:04:40 -0400 
(0:00:00.043) 0:00:47.824 ***** 41175 1727204680.68562: entering _queue_task() for managed-node3/ping 41175 1727204680.68801: worker is 1 (out of 1 available) 41175 1727204680.68815: exiting _queue_task() for managed-node3/ping 41175 1727204680.68827: done queuing things up, now waiting for results queue to drain 41175 1727204680.68829: waiting for pending results... 41175 1727204680.69021: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 41175 1727204680.69104: in run() - task 12b410aa-8751-f070-39c4-0000000000ec 41175 1727204680.69117: variable 'ansible_search_path' from source: unknown 41175 1727204680.69121: variable 'ansible_search_path' from source: unknown 41175 1727204680.69156: calling self._execute() 41175 1727204680.69243: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204680.69249: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204680.69260: variable 'omit' from source: magic vars 41175 1727204680.69597: variable 'ansible_distribution_major_version' from source: facts 41175 1727204680.69610: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204680.69617: variable 'omit' from source: magic vars 41175 1727204680.69653: variable 'omit' from source: magic vars 41175 1727204680.69684: variable 'omit' from source: magic vars 41175 1727204680.69726: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41175 1727204680.69756: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41175 1727204680.69774: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41175 1727204680.69792: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204680.69803: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204680.69837: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41175 1727204680.69840: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204680.69844: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204680.69938: Set connection var ansible_shell_executable to /bin/sh 41175 1727204680.69941: Set connection var ansible_shell_type to sh 41175 1727204680.69944: Set connection var ansible_pipelining to False 41175 1727204680.69954: Set connection var ansible_timeout to 10 41175 1727204680.69960: Set connection var ansible_connection to ssh 41175 1727204680.69967: Set connection var ansible_module_compression to ZIP_DEFLATED 41175 1727204680.69987: variable 'ansible_shell_executable' from source: unknown 41175 1727204680.69992: variable 'ansible_connection' from source: unknown 41175 1727204680.69995: variable 'ansible_module_compression' from source: unknown 41175 1727204680.69999: variable 'ansible_shell_type' from source: unknown 41175 1727204680.70002: variable 'ansible_shell_executable' from source: unknown 41175 1727204680.70006: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204680.70012: variable 'ansible_pipelining' from source: unknown 41175 1727204680.70016: variable 'ansible_timeout' from source: unknown 41175 1727204680.70023: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204680.70200: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 41175 1727204680.70211: variable 'omit' from source: magic vars 41175 1727204680.70217: starting attempt loop 41175 1727204680.70224: running 
the handler 41175 1727204680.70238: _low_level_execute_command(): starting 41175 1727204680.70248: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41175 1727204680.70795: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204680.70799: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204680.70803: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204680.70806: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204680.70865: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204680.70872: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204680.70875: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204680.70917: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204680.72680: stdout chunk (state=3): >>>/root <<< 41175 1727204680.72786: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204680.72840: stderr chunk (state=3): >>><<< 41175 1727204680.72844: stdout chunk (state=3): >>><<< 41175 1727204680.72870: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204680.72883: _low_level_execute_command(): starting 41175 1727204680.72891: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204680.728692-43173-101706303255207 `" && echo ansible-tmp-1727204680.728692-43173-101706303255207="` echo /root/.ansible/tmp/ansible-tmp-1727204680.728692-43173-101706303255207 `" ) && sleep 0' 41175 1727204680.73372: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204680.73376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 41175 1727204680.73379: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 41175 1727204680.73388: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 41175 1727204680.73393: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204680.73445: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204680.73452: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204680.73453: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204680.73488: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204680.75467: stdout chunk (state=3): >>>ansible-tmp-1727204680.728692-43173-101706303255207=/root/.ansible/tmp/ansible-tmp-1727204680.728692-43173-101706303255207 <<< 41175 1727204680.75578: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204680.75627: stderr chunk (state=3): >>><<< 41175 1727204680.75631: stdout chunk (state=3): >>><<< 41175 1727204680.75647: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204680.728692-43173-101706303255207=/root/.ansible/tmp/ansible-tmp-1727204680.728692-43173-101706303255207 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204680.75689: variable 'ansible_module_compression' from source: unknown 41175 1727204680.75725: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-41175qa2aqh90/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 41175 1727204680.75756: variable 'ansible_facts' from source: unknown 41175 1727204680.75820: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204680.728692-43173-101706303255207/AnsiballZ_ping.py 41175 1727204680.75924: Sending initial data 41175 1727204680.75927: Sent initial data (152 bytes) 41175 1727204680.76379: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204680.76382: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 41175 1727204680.76384: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 41175 1727204680.76387: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204680.76450: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204680.76453: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204680.76484: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204680.78077: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41175 1727204680.78107: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41175 1727204680.78144: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41175qa2aqh90/tmpstusl2v2 /root/.ansible/tmp/ansible-tmp-1727204680.728692-43173-101706303255207/AnsiballZ_ping.py <<< 41175 1727204680.78148: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204680.728692-43173-101706303255207/AnsiballZ_ping.py" <<< 41175 1727204680.78176: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-41175qa2aqh90/tmpstusl2v2" to remote "/root/.ansible/tmp/ansible-tmp-1727204680.728692-43173-101706303255207/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204680.728692-43173-101706303255207/AnsiballZ_ping.py" <<< 41175 1727204680.78908: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204680.78974: stderr chunk (state=3): >>><<< 41175 1727204680.78977: stdout chunk (state=3): >>><<< 41175 1727204680.78999: done transferring module to remote 41175 1727204680.79009: _low_level_execute_command(): starting 41175 1727204680.79015: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204680.728692-43173-101706303255207/ /root/.ansible/tmp/ansible-tmp-1727204680.728692-43173-101706303255207/AnsiballZ_ping.py && sleep 0' 41175 1727204680.79467: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204680.79470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204680.79473: stderr chunk (state=3): >>>debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 41175 1727204680.79475: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204680.79537: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204680.79544: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204680.79579: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204680.81492: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204680.81495: stdout chunk (state=3): >>><<< 41175 1727204680.81498: stderr chunk (state=3): >>><<< 41175 1727204680.81610: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204680.81613: _low_level_execute_command(): starting 41175 1727204680.81618: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204680.728692-43173-101706303255207/AnsiballZ_ping.py && sleep 0' 41175 1727204680.82247: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204680.82291: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204680.82308: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204680.82332: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204680.82411: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204680.99534: stdout chunk (state=3): >>> {"ping": 
"pong", "invocation": {"module_args": {"data": "pong"}}} <<< 41175 1727204681.01049: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 41175 1727204681.01196: stdout chunk (state=3): >>><<< 41175 1727204681.01200: stderr chunk (state=3): >>><<< 41175 1727204681.01203: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
41175 1727204681.01205: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204680.728692-43173-101706303255207/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41175 1727204681.01208: _low_level_execute_command(): starting 41175 1727204681.01210: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204680.728692-43173-101706303255207/ > /dev/null 2>&1 && sleep 0' 41175 1727204681.01833: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 1727204681.01849: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204681.01868: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204681.01898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204681.02000: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204681.02026: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204681.02045: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204681.02066: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204681.02127: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204681.04120: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204681.04131: stdout chunk (state=3): >>><<< 41175 1727204681.04142: stderr chunk (state=3): >>><<< 41175 1727204681.04162: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 
1727204681.04178: handler run complete 41175 1727204681.04206: attempt loop complete, returning result 41175 1727204681.04395: _execute() done 41175 1727204681.04398: dumping result to json 41175 1727204681.04401: done dumping result, returning 41175 1727204681.04403: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [12b410aa-8751-f070-39c4-0000000000ec] 41175 1727204681.04405: sending task result for task 12b410aa-8751-f070-39c4-0000000000ec 41175 1727204681.04480: done sending task result for task 12b410aa-8751-f070-39c4-0000000000ec 41175 1727204681.04483: WORKER PROCESS EXITING
ok: [managed-node3] => {
    "changed": false,
    "ping": "pong"
}
41175 1727204681.04559: no more pending results, returning what we have 41175 1727204681.04564: results queue empty 41175 1727204681.04565: checking for any_errors_fatal 41175 1727204681.04575: done checking for any_errors_fatal 41175 1727204681.04577: checking for max_fail_percentage 41175 1727204681.04579: done checking for max_fail_percentage 41175 1727204681.04580: checking to see if all hosts have failed and the running result is not ok 41175 1727204681.04581: done checking to see if all hosts have failed 41175 1727204681.04582: getting the remaining hosts for this loop 41175 1727204681.04584: done getting the remaining hosts for this loop 41175 1727204681.04590: getting the next task for host managed-node3 41175 1727204681.04704: done getting next task for host managed-node3 41175 1727204681.04707: ^ task is: TASK: meta (role_complete) 41175 1727204681.04710: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204681.04723: getting variables 41175 1727204681.04724: in VariableManager get_vars() 41175 1727204681.04769: Calling all_inventory to load vars for managed-node3 41175 1727204681.04772: Calling groups_inventory to load vars for managed-node3 41175 1727204681.04775: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204681.04788: Calling all_plugins_play to load vars for managed-node3 41175 1727204681.04910: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204681.04915: Calling groups_plugins_play to load vars for managed-node3 41175 1727204681.07323: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204681.10275: done with get_vars() 41175 1727204681.10327: done getting variables 41175 1727204681.10429: done queuing things up, now waiting for results queue to drain 41175 1727204681.10432: results queue empty 41175 1727204681.10433: checking for any_errors_fatal 41175 1727204681.10437: done checking for any_errors_fatal 41175 1727204681.10438: checking for max_fail_percentage 41175 1727204681.10439: done checking for max_fail_percentage 41175 1727204681.10440: checking to see if all hosts have failed and the running result is not ok 41175 1727204681.10441: done checking to see if all hosts have failed 41175 1727204681.10442: getting the remaining hosts for this loop 41175 1727204681.10444: done getting the remaining hosts for this loop 41175 1727204681.10447: getting the next task for host managed-node3 41175 1727204681.10452: done getting next task for host managed-node3 41175 1727204681.10454: ^ task is: TASK: meta (flush_handlers) 41175 1727204681.10456: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 41175 1727204681.10460: getting variables 41175 1727204681.10461: in VariableManager get_vars() 41175 1727204681.10476: Calling all_inventory to load vars for managed-node3 41175 1727204681.10479: Calling groups_inventory to load vars for managed-node3 41175 1727204681.10482: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204681.10488: Calling all_plugins_play to load vars for managed-node3 41175 1727204681.10494: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204681.10498: Calling groups_plugins_play to load vars for managed-node3 41175 1727204681.12612: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204681.17341: done with get_vars() 41175 1727204681.17501: done getting variables 41175 1727204681.17565: in VariableManager get_vars() 41175 1727204681.17582: Calling all_inventory to load vars for managed-node3 41175 1727204681.17585: Calling groups_inventory to load vars for managed-node3 41175 1727204681.17588: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204681.17701: Calling all_plugins_play to load vars for managed-node3 41175 1727204681.17705: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204681.17709: Calling groups_plugins_play to load vars for managed-node3 41175 1727204681.21701: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204681.24860: done with get_vars() 41175 1727204681.24914: done queuing things up, now waiting for results queue to drain 41175 1727204681.24917: results queue empty 41175 1727204681.24918: checking for any_errors_fatal 41175 1727204681.24919: done checking for any_errors_fatal 41175 1727204681.24920: checking for max_fail_percentage 41175 1727204681.24922: done checking for max_fail_percentage 41175 1727204681.24923: checking to see if all hosts have failed and 
the running result is not ok 41175 1727204681.24924: done checking to see if all hosts have failed 41175 1727204681.24925: getting the remaining hosts for this loop 41175 1727204681.24927: done getting the remaining hosts for this loop 41175 1727204681.24930: getting the next task for host managed-node3 41175 1727204681.24935: done getting next task for host managed-node3 41175 1727204681.24937: ^ task is: TASK: meta (flush_handlers) 41175 1727204681.24939: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41175 1727204681.24942: getting variables 41175 1727204681.24943: in VariableManager get_vars() 41175 1727204681.24958: Calling all_inventory to load vars for managed-node3 41175 1727204681.24961: Calling groups_inventory to load vars for managed-node3 41175 1727204681.24964: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204681.24971: Calling all_plugins_play to load vars for managed-node3 41175 1727204681.24974: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204681.24978: Calling groups_plugins_play to load vars for managed-node3 41175 1727204681.32161: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204681.35200: done with get_vars() 41175 1727204681.35240: done getting variables 41175 1727204681.35308: in VariableManager get_vars() 41175 1727204681.35323: Calling all_inventory to load vars for managed-node3 41175 1727204681.35327: Calling groups_inventory to load vars for managed-node3 41175 1727204681.35329: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204681.35336: Calling all_plugins_play to load vars for managed-node3 41175 1727204681.35339: Calling 
groups_plugins_inventory to load vars for managed-node3 41175 1727204681.35343: Calling groups_plugins_play to load vars for managed-node3 41175 1727204681.37238: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204681.40277: done with get_vars() 41175 1727204681.40321: done queuing things up, now waiting for results queue to drain 41175 1727204681.40324: results queue empty 41175 1727204681.40325: checking for any_errors_fatal 41175 1727204681.40326: done checking for any_errors_fatal 41175 1727204681.40327: checking for max_fail_percentage 41175 1727204681.40329: done checking for max_fail_percentage 41175 1727204681.40330: checking to see if all hosts have failed and the running result is not ok 41175 1727204681.40331: done checking to see if all hosts have failed 41175 1727204681.40332: getting the remaining hosts for this loop 41175 1727204681.40333: done getting the remaining hosts for this loop 41175 1727204681.40336: getting the next task for host managed-node3 41175 1727204681.40340: done getting next task for host managed-node3 41175 1727204681.40341: ^ task is: None 41175 1727204681.40343: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204681.40345: done queuing things up, now waiting for results queue to drain 41175 1727204681.40346: results queue empty 41175 1727204681.40347: checking for any_errors_fatal 41175 1727204681.40349: done checking for any_errors_fatal 41175 1727204681.40350: checking for max_fail_percentage 41175 1727204681.40351: done checking for max_fail_percentage 41175 1727204681.40352: checking to see if all hosts have failed and the running result is not ok 41175 1727204681.40353: done checking to see if all hosts have failed 41175 1727204681.40354: getting the next task for host managed-node3 41175 1727204681.40357: done getting next task for host managed-node3 41175 1727204681.40359: ^ task is: None 41175 1727204681.40360: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204681.40404: in VariableManager get_vars() 41175 1727204681.40422: done with get_vars() 41175 1727204681.40429: in VariableManager get_vars() 41175 1727204681.40440: done with get_vars() 41175 1727204681.40444: variable 'omit' from source: magic vars 41175 1727204681.40479: in VariableManager get_vars() 41175 1727204681.40494: done with get_vars() 41175 1727204681.40519: variable 'omit' from source: magic vars PLAY [Assert device and profile are absent] ************************************ 41175 1727204681.40805: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 41175 1727204681.40826: getting the remaining hosts for this loop 41175 1727204681.40828: done getting the remaining hosts for this loop 41175 1727204681.40831: getting the next task for host managed-node3 41175 1727204681.40833: done getting next task for host managed-node3 41175 1727204681.40835: ^ task is: TASK: Gathering Facts 41175 1727204681.40837: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204681.40839: getting variables 41175 1727204681.40840: in VariableManager get_vars() 41175 1727204681.40849: Calling all_inventory to load vars for managed-node3 41175 1727204681.40852: Calling groups_inventory to load vars for managed-node3 41175 1727204681.40854: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204681.40860: Calling all_plugins_play to load vars for managed-node3 41175 1727204681.40863: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204681.40866: Calling groups_plugins_play to load vars for managed-node3 41175 1727204681.42911: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204681.45938: done with get_vars() 41175 1727204681.45974: done getting variables 41175 1727204681.46032: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:149 Tuesday 24 September 2024 15:04:41 -0400 (0:00:00.774) 0:00:48.599 ***** 41175 1727204681.46060: entering _queue_task() for managed-node3/gather_facts 41175 1727204681.46431: worker is 1 (out of 1 available) 41175 1727204681.46444: exiting _queue_task() for managed-node3/gather_facts 41175 1727204681.46457: done queuing things up, now waiting for results queue to drain 41175 1727204681.46459: waiting for pending results... 
41175 1727204681.46731: running TaskExecutor() for managed-node3/TASK: Gathering Facts 41175 1727204681.46980: in run() - task 12b410aa-8751-f070-39c4-00000000085b 41175 1727204681.46984: variable 'ansible_search_path' from source: unknown 41175 1727204681.46987: calling self._execute() 41175 1727204681.47068: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204681.47082: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204681.47102: variable 'omit' from source: magic vars 41175 1727204681.47674: variable 'ansible_distribution_major_version' from source: facts 41175 1727204681.47784: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204681.47789: variable 'omit' from source: magic vars 41175 1727204681.47792: variable 'omit' from source: magic vars 41175 1727204681.47836: variable 'omit' from source: magic vars 41175 1727204681.47928: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41175 1727204681.47997: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41175 1727204681.48034: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41175 1727204681.48119: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204681.48123: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204681.48220: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41175 1727204681.48224: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204681.48227: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204681.48313: Set connection var ansible_shell_executable to /bin/sh 41175 1727204681.48331: Set 
connection var ansible_shell_type to sh 41175 1727204681.48352: Set connection var ansible_pipelining to False 41175 1727204681.48369: Set connection var ansible_timeout to 10 41175 1727204681.48380: Set connection var ansible_connection to ssh 41175 1727204681.48394: Set connection var ansible_module_compression to ZIP_DEFLATED 41175 1727204681.48435: variable 'ansible_shell_executable' from source: unknown 41175 1727204681.48446: variable 'ansible_connection' from source: unknown 41175 1727204681.48459: variable 'ansible_module_compression' from source: unknown 41175 1727204681.48468: variable 'ansible_shell_type' from source: unknown 41175 1727204681.48508: variable 'ansible_shell_executable' from source: unknown 41175 1727204681.48511: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204681.48523: variable 'ansible_pipelining' from source: unknown 41175 1727204681.48526: variable 'ansible_timeout' from source: unknown 41175 1727204681.48529: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204681.48790: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41175 1727204681.48891: variable 'omit' from source: magic vars 41175 1727204681.48896: starting attempt loop 41175 1727204681.48899: running the handler 41175 1727204681.48901: variable 'ansible_facts' from source: unknown 41175 1727204681.48903: _low_level_execute_command(): starting 41175 1727204681.48906: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41175 1727204681.49869: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 1727204681.49893: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204681.49952: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204681.49991: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204681.50066: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204681.51864: stdout chunk (state=3): >>>/root <<< 41175 1727204681.52070: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204681.52074: stdout chunk (state=3): >>><<< 41175 1727204681.52076: stderr chunk (state=3): >>><<< 41175 1727204681.52100: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204681.52126: _low_level_execute_command(): starting 41175 1727204681.52229: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204681.5210698-43195-245646432970041 `" && echo ansible-tmp-1727204681.5210698-43195-245646432970041="` echo /root/.ansible/tmp/ansible-tmp-1727204681.5210698-43195-245646432970041 `" ) && sleep 0' 41175 1727204681.52810: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 1727204681.52831: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204681.52956: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204681.52986: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204681.53058: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204681.55031: stdout chunk (state=3): >>>ansible-tmp-1727204681.5210698-43195-245646432970041=/root/.ansible/tmp/ansible-tmp-1727204681.5210698-43195-245646432970041 <<< 41175 1727204681.55155: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204681.55213: stderr chunk (state=3): >>><<< 41175 1727204681.55228: stdout chunk (state=3): >>><<< 41175 1727204681.55257: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204681.5210698-43195-245646432970041=/root/.ansible/tmp/ansible-tmp-1727204681.5210698-43195-245646432970041 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204681.55495: variable 'ansible_module_compression' from source: unknown 41175 1727204681.55499: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-41175qa2aqh90/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 41175 1727204681.55502: variable 'ansible_facts' from source: unknown 41175 1727204681.55620: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204681.5210698-43195-245646432970041/AnsiballZ_setup.py 41175 1727204681.55813: Sending initial data 41175 1727204681.55828: Sent initial data (154 bytes) 41175 1727204681.56511: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204681.56529: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41175 1727204681.56612: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204681.56647: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204681.56668: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 
41175 1727204681.56686: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204681.56747: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204681.58345: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41175 1727204681.58408: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41175 1727204681.58450: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204681.5210698-43195-245646432970041/AnsiballZ_setup.py" <<< 41175 1727204681.58516: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41175qa2aqh90/tmppzq1uxng /root/.ansible/tmp/ansible-tmp-1727204681.5210698-43195-245646432970041/AnsiballZ_setup.py <<< 41175 1727204681.58521: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-41175qa2aqh90/tmppzq1uxng" to remote "/root/.ansible/tmp/ansible-tmp-1727204681.5210698-43195-245646432970041/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204681.5210698-43195-245646432970041/AnsiballZ_setup.py" <<< 41175 1727204681.60526: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204681.60690: stderr chunk (state=3): >>><<< 41175 1727204681.60694: stdout chunk (state=3): >>><<< 41175 1727204681.60697: done transferring module to remote 41175 1727204681.60700: _low_level_execute_command(): starting 41175 1727204681.60702: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204681.5210698-43195-245646432970041/ /root/.ansible/tmp/ansible-tmp-1727204681.5210698-43195-245646432970041/AnsiballZ_setup.py && sleep 0' 41175 1727204681.61067: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204681.61081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204681.61096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204681.61145: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204681.61166: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204681.61199: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204681.63048: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204681.63096: stderr chunk (state=3): >>><<< 41175 1727204681.63100: stdout chunk (state=3): >>><<< 41175 1727204681.63120: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204681.63127: _low_level_execute_command(): starting 41175 1727204681.63130: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204681.5210698-43195-245646432970041/AnsiballZ_setup.py && sleep 0' 41175 1727204681.63554: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204681.63597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204681.63601: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 41175 1727204681.63604: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204681.63606: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 41175 1727204681.63608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204681.63656: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204681.63664: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master 
version 4 <<< 41175 1727204681.63708: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204682.33494: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAI5YZQ7OH6eqgmanrwxkUl16pMvE2q26X32NofYRKBzF04m84VIsiCBP80rN+sGEKnRhTwlxJwcSfAyscmxkynk8ozeR0SaMEECkbOjee1DqGR1yz8VSKEIk2gZ+ImYscF6c32jGvz1w/gz9baswEs+v92Ljqv3+V3s8foVkwWM1AAAAFQDApo03iAyJzp9y7AillVl9LpN8rwAAAIBNHNvfLLH/rvWMdavYWGiljarx5Z8cDKFv4QiliuY2AenrQ5mjBN3ZJZuDpmwC9vuoPM+TWxp9pbrnVJy4VM6iS8c/Lr9I982fUD4neMvJEywdnYtsRhezGMCk57/Npw91h6EKhcAYiaFF53jl540WIjTvu2bEA8Hgb11YGH+isAAAAIAkremps+61DEFeDWQjRHbf8fZzhmpUYduU+sHRW5usa/1cOOeeN/8XBHfMST6TPedAY/6t7Oxda9D2mq6mo2Rl9arSQWcBypqwvzRiz0LGnRnElGtXKJALy6vYKG7xi+29ZmqlBvD14cB7/wSZqZP9MkRj3+QzQJLvNnuGRyLguA==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDBj8PEtqglWtlJ3r3hgP2TELjSd8JOOpjIitLlWjKdUao5ePB6PWTf9MZV0rLZr0re7hAS1EWeexARYQakyETmyOoPmRCaD5vvrfN3AJJ6I+O2EhApLpYrEORJbTfrme6AoCGmxQG8tR7j3YpVOvePZ65ka7FDUWnNLI0DWpyDURAKmvOxtiOcYazpmB7GJ/5ycpEAV7KGp7tEQ9MNIAbSaYTBXVBNa5V2HyEmcabs+/Qy/jp8OWy+Tl3uCUV0SmFplVCKib9Kp3eEMZd5udXsYnmUYtLNMJQkQOzTdol5AozustkdBnasVn/RSnQpWQMBrrUQMxchNOb8FDAuH6AONEVJl9mHY6mk3zfkkyPZE6sIrMIj0B48xTWzMIjC+N9SN7DRRUWzjYIqkL5fsYu0fkkGuZeNvyJRlv8h7oFWA7YtvNHdNYf41mkXryERg8V3zI0aZcmQul6XTOxywwd4b5sudMIng09hfyPOKtnYi6DIN2h5FxOWlvBEbLlcd2U=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPUvqdp1GSRMDwSqfOZO1hLGpDfzy51B9cIhTK2AWy7qlUXPaSlJ0jc31uj+CW3SnUW36VSKRHdj9R9hJev9Zic=", 
"ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFL7RdA+aCgUcBhcJBLwti3mnwduhYXxSw8RlI3Cvebm", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec272ed147e29e35f2e68cd6465c5ec1", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2849, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 868, "free": 2849}, "nocache": {"free": 3480, "used": 237}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec272ed1-47e2-9e35-f2e6-8cd6465c5ec1", "ansible_product_uuid": "ec272ed1-47e2-9e35-f2e6-8cd6465c5ec1", "ansible_product_version": "4.11.amazon", 
"ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links":<<< 41175 1727204682.33546: stdout chunk (state=3): >>> {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 1186, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251148525568, "block_size": 4096, "block_total": 64479564, "block_available": 61315558, "block_used": 3164006, "inode_total": 16384000, "inode_available": 16302070, "inode_used": 81930, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], 
"ansible_fips": false, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 50414 10.31.10.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 50414 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_lsb": {}, "ansible_pkg_mgr": "dnf", "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:5e:c8:16:36:1d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.90", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::37d3:4e93:30d:de94", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off 
[fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]"<<< 41175 1727204682.33556: stdout chunk (state=3): >>>, "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": 
"off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", 
"rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off<<< 41175 1727204682.33579: stdout chunk (state=3): >>> [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.90", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:5e:c8:16:36:1d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.90"], "ansible_all_ipv6_addresses": ["fe80::37d3:4e93:30d:de94"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::37d3:4e93:30d:de94"]}, "ansible_loadavg": {"1m": 1.13427734375, "5m": 0.91064453125, "15m": 0.5556640625}, "ansible_is_chroot": false, "ansible_hostnqn": "", "ansible_apparmor": {"status": "disabled"}, "ansible_service_mgr": "systemd", "ansible_iscsi_iqn": "", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "04", "second": "42", "epoch": "1727204682", "epoch_int": "1727204682", "date": "2024-09-24", 
"time": "15:04:42", "iso8601_micro": "2024-09-24T19:04:42.330955Z", "iso8601": "2024-09-24T19:04:42Z", "iso8601_basic": "20240924T150442330955", "iso8601_basic_short": "20240924T150442", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_local": {}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_fibre_channel_wwn": [], "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 41175 1727204682.35596: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
<<< 41175 1727204682.35664: stderr chunk (state=3): >>><<< 41175 1727204682.35667: stdout chunk (state=3): >>><<< 41175 1727204682.35707: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAI5YZQ7OH6eqgmanrwxkUl16pMvE2q26X32NofYRKBzF04m84VIsiCBP80rN+sGEKnRhTwlxJwcSfAyscmxkynk8ozeR0SaMEECkbOjee1DqGR1yz8VSKEIk2gZ+ImYscF6c32jGvz1w/gz9baswEs+v92Ljqv3+V3s8foVkwWM1AAAAFQDApo03iAyJzp9y7AillVl9LpN8rwAAAIBNHNvfLLH/rvWMdavYWGiljarx5Z8cDKFv4QiliuY2AenrQ5mjBN3ZJZuDpmwC9vuoPM+TWxp9pbrnVJy4VM6iS8c/Lr9I982fUD4neMvJEywdnYtsRhezGMCk57/Npw91h6EKhcAYiaFF53jl540WIjTvu2bEA8Hgb11YGH+isAAAAIAkremps+61DEFeDWQjRHbf8fZzhmpUYduU+sHRW5usa/1cOOeeN/8XBHfMST6TPedAY/6t7Oxda9D2mq6mo2Rl9arSQWcBypqwvzRiz0LGnRnElGtXKJALy6vYKG7xi+29ZmqlBvD14cB7/wSZqZP9MkRj3+QzQJLvNnuGRyLguA==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDBj8PEtqglWtlJ3r3hgP2TELjSd8JOOpjIitLlWjKdUao5ePB6PWTf9MZV0rLZr0re7hAS1EWeexARYQakyETmyOoPmRCaD5vvrfN3AJJ6I+O2EhApLpYrEORJbTfrme6AoCGmxQG8tR7j3YpVOvePZ65ka7FDUWnNLI0DWpyDURAKmvOxtiOcYazpmB7GJ/5ycpEAV7KGp7tEQ9MNIAbSaYTBXVBNa5V2HyEmcabs+/Qy/jp8OWy+Tl3uCUV0SmFplVCKib9Kp3eEMZd5udXsYnmUYtLNMJQkQOzTdol5AozustkdBnasVn/RSnQpWQMBrrUQMxchNOb8FDAuH6AONEVJl9mHY6mk3zfkkyPZE6sIrMIj0B48xTWzMIjC+N9SN7DRRUWzjYIqkL5fsYu0fkkGuZeNvyJRlv8h7oFWA7YtvNHdNYf41mkXryERg8V3zI0aZcmQul6XTOxywwd4b5sudMIng09hfyPOKtnYi6DIN2h5FxOWlvBEbLlcd2U=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPUvqdp1GSRMDwSqfOZO1hLGpDfzy51B9cIhTK2AWy7qlUXPaSlJ0jc31uj+CW3SnUW36VSKRHdj9R9hJev9Zic=", 
"ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFL7RdA+aCgUcBhcJBLwti3mnwduhYXxSw8RlI3Cvebm", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec272ed147e29e35f2e68cd6465c5ec1", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2849, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 868, "free": 2849}, "nocache": {"free": 3480, "used": 237}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec272ed1-47e2-9e35-f2e6-8cd6465c5ec1", "ansible_product_uuid": "ec272ed1-47e2-9e35-f2e6-8cd6465c5ec1", "ansible_product_version": "4.11.amazon", 
"ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 1186, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251148525568, "block_size": 4096, "block_total": 64479564, "block_available": 61315558, "block_used": 3164006, "inode_total": 16384000, "inode_available": 16302070, "inode_used": 81930, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fips": false, "ansible_virtualization_type": "xen", 
"ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 50414 10.31.10.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 50414 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_lsb": {}, "ansible_pkg_mgr": "dnf", "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:5e:c8:16:36:1d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.90", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::37d3:4e93:30d:de94", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", 
"scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", 
"hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", 
"rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.90", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:5e:c8:16:36:1d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.90"], "ansible_all_ipv6_addresses": ["fe80::37d3:4e93:30d:de94"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::37d3:4e93:30d:de94"]}, "ansible_loadavg": {"1m": 1.13427734375, "5m": 0.91064453125, "15m": 0.5556640625}, "ansible_is_chroot": false, "ansible_hostnqn": "", "ansible_apparmor": {"status": "disabled"}, "ansible_service_mgr": "systemd", "ansible_iscsi_iqn": "", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "04", "second": "42", "epoch": "1727204682", "epoch_int": "1727204682", "date": "2024-09-24", "time": "15:04:42", "iso8601_micro": "2024-09-24T19:04:42.330955Z", "iso8601": "2024-09-24T19:04:42Z", "iso8601_basic": 
"20240924T150442330955", "iso8601_basic_short": "20240924T150442", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_local": {}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_fibre_channel_wwn": [], "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 
originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed.
41175 1727204682.36051: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204681.5210698-43195-245646432970041/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None})
41175 1727204682.36073: _low_level_execute_command(): starting
41175 1727204682.36076: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204681.5210698-43195-245646432970041/ > /dev/null 2>&1 && sleep 0'
41175 1727204682.36567: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<<
41175 1727204682.36571: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
41175 1727204682.36574: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
41175 1727204682.36576: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
41175 1727204682.36631: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
41175 1727204682.36639: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
41175 1727204682.36677: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
41175 1727204682.38585: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
41175 1727204682.38642: stderr chunk (state=3): >>><<<
41175 1727204682.38646: stdout chunk (state=3): >>><<<
41175 1727204682.38660: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
41175 1727204682.38670: handler run complete
41175 1727204682.38792: variable 'ansible_facts' from source: unknown
41175 1727204682.38893: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41175 1727204682.39176: variable 'ansible_facts' from source: unknown
41175 1727204682.39256: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41175 1727204682.39375: attempt loop complete, returning result
41175 1727204682.39381: _execute() done
41175 1727204682.39384: dumping result to json
41175 1727204682.39412: done dumping result, returning
41175 1727204682.39423: done running TaskExecutor() for managed-node3/TASK: Gathering Facts [12b410aa-8751-f070-39c4-00000000085b]
41175 1727204682.39429: sending task result for task 12b410aa-8751-f070-39c4-00000000085b
ok: [managed-node3]
41175 1727204682.40231: no more pending results, returning what we have
41175 1727204682.40234: results queue empty
41175 1727204682.40235: checking for any_errors_fatal
41175 1727204682.40236: done checking for any_errors_fatal
41175 1727204682.40236: checking for max_fail_percentage
41175 1727204682.40238: done checking for max_fail_percentage
41175 1727204682.40238: checking to see if all hosts have failed and the running result is not ok
41175 1727204682.40239: done checking to see if all hosts have failed
41175 1727204682.40240: getting the remaining hosts for this loop
41175 1727204682.40241: done getting the remaining hosts for this loop
41175 1727204682.40243: getting the next task for host managed-node3
41175 1727204682.40247: done getting next task for host managed-node3
41175 1727204682.40249: ^ task is: TASK: meta (flush_handlers)
41175 1727204682.40250: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
41175 1727204682.40254: getting variables
41175 1727204682.40255: in VariableManager get_vars()
41175 1727204682.40273: Calling all_inventory to load vars for managed-node3
41175 1727204682.40276: Calling groups_inventory to load vars for managed-node3
41175 1727204682.40278: Calling all_plugins_inventory to load vars for managed-node3
41175 1727204682.40291: Calling all_plugins_play to load vars for managed-node3
41175 1727204682.40293: Calling groups_plugins_inventory to load vars for managed-node3
41175 1727204682.40297: Calling groups_plugins_play to load vars for managed-node3
41175 1727204682.40815: done sending task result for task 12b410aa-8751-f070-39c4-00000000085b
41175 1727204682.40820: WORKER PROCESS EXITING
41175 1727204682.41608: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41175 1727204682.43255: done with get_vars()
41175 1727204682.43287: done getting variables
41175 1727204682.43350: in VariableManager get_vars()
41175 1727204682.43359: Calling all_inventory to load vars for managed-node3
41175 1727204682.43361: Calling groups_inventory to load vars for managed-node3
41175 1727204682.43363: Calling all_plugins_inventory to load vars for managed-node3
41175 1727204682.43367: Calling all_plugins_play to load vars for managed-node3
41175 1727204682.43370: Calling groups_plugins_inventory to load vars for managed-node3
41175 1727204682.43374: Calling groups_plugins_play to load vars for managed-node3
41175 1727204682.44596: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41175 1727204682.46239: done with get_vars()
41175 1727204682.46268: done queuing things up, now waiting for results queue to drain
41175 1727204682.46270: results queue empty
41175 1727204682.46271: checking for any_errors_fatal
41175 1727204682.46275: done checking for any_errors_fatal
41175 1727204682.46275: checking for max_fail_percentage
41175 1727204682.46276: done checking for max_fail_percentage
41175 1727204682.46277: checking to see if all hosts have failed and the running result is not ok
41175 1727204682.46284: done checking to see if all hosts have failed
41175 1727204682.46285: getting the remaining hosts for this loop
41175 1727204682.46286: done getting the remaining hosts for this loop
41175 1727204682.46291: getting the next task for host managed-node3
41175 1727204682.46297: done getting next task for host managed-node3
41175 1727204682.46301: ^ task is: TASK: Include the task 'assert_profile_absent.yml'
41175 1727204682.46302: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
41175 1727204682.46304: getting variables
41175 1727204682.46305: in VariableManager get_vars()
41175 1727204682.46314: Calling all_inventory to load vars for managed-node3
41175 1727204682.46316: Calling groups_inventory to load vars for managed-node3
41175 1727204682.46320: Calling all_plugins_inventory to load vars for managed-node3
41175 1727204682.46325: Calling all_plugins_play to load vars for managed-node3
41175 1727204682.46327: Calling groups_plugins_inventory to load vars for managed-node3
41175 1727204682.46329: Calling groups_plugins_play to load vars for managed-node3
41175 1727204682.47468: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41175 1727204682.49128: done with get_vars()
41175 1727204682.49150: done getting variables

TASK [Include the task 'assert_profile_absent.yml'] ****************************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:152
Tuesday 24 September 2024 15:04:42 -0400 (0:00:01.031) 0:00:49.631 *****
41175 1727204682.49218: entering _queue_task() for managed-node3/include_tasks
41175 1727204682.49498: worker is 1 (out of 1 available)
41175 1727204682.49513: exiting _queue_task() for managed-node3/include_tasks
41175 1727204682.49528: done queuing things up, now waiting for results queue to drain
41175 1727204682.49530: waiting for pending results...
41175 1727204682.49726: running TaskExecutor() for managed-node3/TASK: Include the task 'assert_profile_absent.yml'
41175 1727204682.49809: in run() - task 12b410aa-8751-f070-39c4-0000000000ef
41175 1727204682.49826: variable 'ansible_search_path' from source: unknown
41175 1727204682.49858: calling self._execute()
41175 1727204682.49945: variable 'ansible_host' from source: host vars for 'managed-node3'
41175 1727204682.49953: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
41175 1727204682.49964: variable 'omit' from source: magic vars
41175 1727204682.50301: variable 'ansible_distribution_major_version' from source: facts
41175 1727204682.50321: Evaluated conditional (ansible_distribution_major_version != '6'): True
41175 1727204682.50325: _execute() done
41175 1727204682.50329: dumping result to json
41175 1727204682.50332: done dumping result, returning
41175 1727204682.50337: done running TaskExecutor() for managed-node3/TASK: Include the task 'assert_profile_absent.yml' [12b410aa-8751-f070-39c4-0000000000ef]
41175 1727204682.50344: sending task result for task 12b410aa-8751-f070-39c4-0000000000ef
41175 1727204682.50452: done sending task result for task 12b410aa-8751-f070-39c4-0000000000ef
41175 1727204682.50455: WORKER PROCESS EXITING
41175 1727204682.50486: no more pending results, returning what we have
41175 1727204682.50494: in VariableManager get_vars()
41175 1727204682.50531: Calling all_inventory to load vars for managed-node3
41175 1727204682.50535: Calling groups_inventory to load vars for managed-node3
41175 1727204682.50538: Calling all_plugins_inventory to load vars for managed-node3
41175 1727204682.50554: Calling all_plugins_play to load vars for managed-node3
41175 1727204682.50557: Calling groups_plugins_inventory to load vars for managed-node3
41175 1727204682.50561: Calling groups_plugins_play to load vars for managed-node3
41175 1727204682.51822: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41175 1727204682.53449: done with get_vars()
41175 1727204682.53472: variable 'ansible_search_path' from source: unknown
41175 1727204682.53485: we have included files to process
41175 1727204682.53486: generating all_blocks data
41175 1727204682.53487: done generating all_blocks data
41175 1727204682.53488: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml
41175 1727204682.53491: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml
41175 1727204682.53493: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml
41175 1727204682.53637: in VariableManager get_vars()
41175 1727204682.53651: done with get_vars()
41175 1727204682.53750: done processing included file
41175 1727204682.53752: iterating over new_blocks loaded from include file
41175 1727204682.53753: in VariableManager get_vars()
41175 1727204682.53762: done with get_vars()
41175 1727204682.53764: filtering new block on tags
41175 1727204682.53777: done filtering new block on tags
41175 1727204682.53779: done iterating over new_blocks loaded from include file
included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for managed-node3
41175 1727204682.53783: extending task lists for all hosts with included blocks
41175 1727204682.53842: done extending task lists
41175 1727204682.53843: done processing included files
41175 1727204682.53844: results queue empty
41175 1727204682.53844: checking for any_errors_fatal
41175 1727204682.53846: done checking for any_errors_fatal
41175 1727204682.53846: checking for max_fail_percentage
41175 1727204682.53847: done checking for max_fail_percentage
41175 1727204682.53848: checking to see if all hosts have failed and the running result is not ok
41175 1727204682.53849: done checking to see if all hosts have failed
41175 1727204682.53849: getting the remaining hosts for this loop
41175 1727204682.53850: done getting the remaining hosts for this loop
41175 1727204682.53852: getting the next task for host managed-node3
41175 1727204682.53855: done getting next task for host managed-node3
41175 1727204682.53857: ^ task is: TASK: Include the task 'get_profile_stat.yml'
41175 1727204682.53859: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
41175 1727204682.53860: getting variables
41175 1727204682.53861: in VariableManager get_vars()
41175 1727204682.53868: Calling all_inventory to load vars for managed-node3
41175 1727204682.53870: Calling groups_inventory to load vars for managed-node3
41175 1727204682.53872: Calling all_plugins_inventory to load vars for managed-node3
41175 1727204682.53876: Calling all_plugins_play to load vars for managed-node3
41175 1727204682.53878: Calling groups_plugins_inventory to load vars for managed-node3
41175 1727204682.53881: Calling groups_plugins_play to load vars for managed-node3
41175 1727204682.55074: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41175 1727204682.56686: done with get_vars()
41175 1727204682.56712: done getting variables

TASK [Include the task 'get_profile_stat.yml'] *********************************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3
Tuesday 24 September 2024 15:04:42 -0400 (0:00:00.075) 0:00:49.706 *****
41175 1727204682.56782: entering _queue_task() for managed-node3/include_tasks
41175 1727204682.57064: worker is 1 (out of 1 available)
41175 1727204682.57077: exiting _queue_task() for managed-node3/include_tasks
41175 1727204682.57091: done queuing things up, now waiting for results queue to drain
41175 1727204682.57093: waiting for pending results...
41175 1727204682.57292: running TaskExecutor() for managed-node3/TASK: Include the task 'get_profile_stat.yml'
41175 1727204682.57382: in run() - task 12b410aa-8751-f070-39c4-00000000086c
41175 1727204682.57398: variable 'ansible_search_path' from source: unknown
41175 1727204682.57402: variable 'ansible_search_path' from source: unknown
41175 1727204682.57442: calling self._execute()
41175 1727204682.57525: variable 'ansible_host' from source: host vars for 'managed-node3'
41175 1727204682.57538: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
41175 1727204682.57547: variable 'omit' from source: magic vars
41175 1727204682.57891: variable 'ansible_distribution_major_version' from source: facts
41175 1727204682.57906: Evaluated conditional (ansible_distribution_major_version != '6'): True
41175 1727204682.57913: _execute() done
41175 1727204682.57918: dumping result to json
41175 1727204682.57921: done dumping result, returning
41175 1727204682.57929: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_profile_stat.yml' [12b410aa-8751-f070-39c4-00000000086c]
41175 1727204682.57936: sending task result for task 12b410aa-8751-f070-39c4-00000000086c
41175 1727204682.58033: done sending task result for task 12b410aa-8751-f070-39c4-00000000086c
41175 1727204682.58036: WORKER PROCESS EXITING
41175 1727204682.58071: no more pending results, returning what we have
41175 1727204682.58077: in VariableManager get_vars()
41175 1727204682.58114: Calling all_inventory to load vars for managed-node3
41175 1727204682.58120: Calling groups_inventory to load vars for managed-node3
41175 1727204682.58124: Calling all_plugins_inventory to load vars for managed-node3
41175 1727204682.58140: Calling all_plugins_play to load vars for managed-node3
41175 1727204682.58144: Calling groups_plugins_inventory to load vars for managed-node3
41175 1727204682.58150: Calling groups_plugins_play to load vars for managed-node3
41175 1727204682.59446: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41175 1727204682.61174: done with get_vars()
41175 1727204682.61196: variable 'ansible_search_path' from source: unknown
41175 1727204682.61198: variable 'ansible_search_path' from source: unknown
41175 1727204682.61233: we have included files to process
41175 1727204682.61234: generating all_blocks data
41175 1727204682.61235: done generating all_blocks data
41175 1727204682.61237: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml
41175 1727204682.61238: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml
41175 1727204682.61240: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml
41175 1727204682.62131: done processing included file
41175 1727204682.62133: iterating over new_blocks loaded from include file
41175 1727204682.62134: in VariableManager get_vars()
41175 1727204682.62145: done with get_vars()
41175 1727204682.62146: filtering new block on tags
41175 1727204682.62165: done filtering new block on tags
41175 1727204682.62167: in VariableManager get_vars()
41175 1727204682.62176: done with get_vars()
41175 1727204682.62177: filtering new block on tags
41175 1727204682.62194: done filtering new block on tags
41175 1727204682.62195: done iterating over new_blocks loaded from include file
included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node3
41175 1727204682.62200: extending task lists for all hosts with included blocks
41175 1727204682.62282: done extending task lists
41175 1727204682.62283: done processing included files
41175 1727204682.62284: results queue empty
41175 1727204682.62284: checking for any_errors_fatal
41175 1727204682.62288: done checking for any_errors_fatal
41175 1727204682.62288: checking for max_fail_percentage
41175 1727204682.62291: done checking for max_fail_percentage
41175 1727204682.62291: checking to see if all hosts have failed and the running result is not ok
41175 1727204682.62292: done checking to see if all hosts have failed
41175 1727204682.62293: getting the remaining hosts for this loop
41175 1727204682.62294: done getting the remaining hosts for this loop
41175 1727204682.62296: getting the next task for host managed-node3
41175 1727204682.62299: done getting next task for host managed-node3
41175 1727204682.62300: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag
41175 1727204682.62303: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
41175 1727204682.62304: getting variables
41175 1727204682.62305: in VariableManager get_vars()
41175 1727204682.62369: Calling all_inventory to load vars for managed-node3
41175 1727204682.62372: Calling groups_inventory to load vars for managed-node3
41175 1727204682.62374: Calling all_plugins_inventory to load vars for managed-node3
41175 1727204682.62379: Calling all_plugins_play to load vars for managed-node3
41175 1727204682.62381: Calling groups_plugins_inventory to load vars for managed-node3
41175 1727204682.62383: Calling groups_plugins_play to load vars for managed-node3
41175 1727204682.63474: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41175 1727204682.65097: done with get_vars()
41175 1727204682.65127: done getting variables
41175 1727204682.65168: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Initialize NM profile exist and ansible_managed comment flag] ************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3
Tuesday 24 September 2024 15:04:42 -0400 (0:00:00.084) 0:00:49.790 *****
41175 1727204682.65196: entering _queue_task() for managed-node3/set_fact
41175 1727204682.65486: worker is 1 (out of 1 available)
41175 1727204682.65501: exiting _queue_task() for managed-node3/set_fact
41175 1727204682.65516: done queuing things up, now waiting for results queue to drain
41175 1727204682.65520: waiting for pending results...
41175 1727204682.65709: running TaskExecutor() for managed-node3/TASK: Initialize NM profile exist and ansible_managed comment flag
41175 1727204682.65808: in run() - task 12b410aa-8751-f070-39c4-00000000087b
41175 1727204682.65862: variable 'ansible_search_path' from source: unknown
41175 1727204682.65866: variable 'ansible_search_path' from source: unknown
41175 1727204682.65869: calling self._execute()
41175 1727204682.65945: variable 'ansible_host' from source: host vars for 'managed-node3'
41175 1727204682.65952: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
41175 1727204682.65968: variable 'omit' from source: magic vars
41175 1727204682.66291: variable 'ansible_distribution_major_version' from source: facts
41175 1727204682.66307: Evaluated conditional (ansible_distribution_major_version != '6'): True
41175 1727204682.66315: variable 'omit' from source: magic vars
41175 1727204682.66358: variable 'omit' from source: magic vars
41175 1727204682.66388: variable 'omit' from source: magic vars
41175 1727204682.66428: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
41175 1727204682.66461: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
41175 1727204682.66480: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
41175 1727204682.66499: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
41175 1727204682.66511: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
41175 1727204682.66542: variable 'inventory_hostname' from source: host vars for 'managed-node3'
41175 1727204682.66546: variable 'ansible_host' from source: host vars for 'managed-node3'
41175 1727204682.66551: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
41175 1727204682.66638: Set connection var ansible_shell_executable to /bin/sh
41175 1727204682.66642: Set connection var ansible_shell_type to sh
41175 1727204682.66650: Set connection var ansible_pipelining to False
41175 1727204682.66659: Set connection var ansible_timeout to 10
41175 1727204682.66665: Set connection var ansible_connection to ssh
41175 1727204682.66671: Set connection var ansible_module_compression to ZIP_DEFLATED
41175 1727204682.66693: variable 'ansible_shell_executable' from source: unknown
41175 1727204682.66696: variable 'ansible_connection' from source: unknown
41175 1727204682.66699: variable 'ansible_module_compression' from source: unknown
41175 1727204682.66704: variable 'ansible_shell_type' from source: unknown
41175 1727204682.66707: variable 'ansible_shell_executable' from source: unknown
41175 1727204682.66711: variable 'ansible_host' from source: host vars for 'managed-node3'
41175 1727204682.66718: variable 'ansible_pipelining' from source: unknown
41175 1727204682.66722: variable 'ansible_timeout' from source: unknown
41175 1727204682.66725: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
41175 1727204682.66849: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
41175 1727204682.66861: variable 'omit' from source: magic vars
41175 1727204682.66868: starting attempt loop
41175 1727204682.66871: running the handler
41175 1727204682.66883: handler run complete
41175 1727204682.66896: attempt loop complete, returning result
41175 1727204682.66899: _execute() done
41175 1727204682.66902: dumping result to json
41175 1727204682.66906: done dumping result, returning
41175 1727204682.66914: done running TaskExecutor() for managed-node3/TASK: Initialize NM profile exist and ansible_managed comment flag [12b410aa-8751-f070-39c4-00000000087b]
41175 1727204682.66958: sending task result for task 12b410aa-8751-f070-39c4-00000000087b
41175 1727204682.67032: done sending task result for task 12b410aa-8751-f070-39c4-00000000087b
41175 1727204682.67035: WORKER PROCESS EXITING
ok: [managed-node3] => {
    "ansible_facts": {
        "lsr_net_profile_ansible_managed": false,
        "lsr_net_profile_exists": false,
        "lsr_net_profile_fingerprint": false
    },
    "changed": false
}
41175 1727204682.67124: no more pending results, returning what we have
41175 1727204682.67128: results queue empty
41175 1727204682.67130: checking for any_errors_fatal
41175 1727204682.67132: done checking for any_errors_fatal
41175 1727204682.67133: checking for max_fail_percentage
41175 1727204682.67134: done checking for max_fail_percentage
41175 1727204682.67135: checking to see if all hosts have failed and the running result is not ok
41175 1727204682.67136: done checking to see if all hosts have failed
41175 1727204682.67137: getting the remaining hosts for this loop
41175 1727204682.67139: done getting the remaining hosts for this loop
41175 1727204682.67143: getting the next task for host managed-node3
41175 1727204682.67150: done getting next task for host managed-node3
41175 1727204682.67153: ^ task is: TASK: Stat profile file
41175 1727204682.67157: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
41175 1727204682.67162: getting variables
41175 1727204682.67163: in VariableManager get_vars()
41175 1727204682.67191: Calling all_inventory to load vars for managed-node3
41175 1727204682.67194: Calling groups_inventory to load vars for managed-node3
41175 1727204682.67198: Calling all_plugins_inventory to load vars for managed-node3
41175 1727204682.67209: Calling all_plugins_play to load vars for managed-node3
41175 1727204682.67212: Calling groups_plugins_inventory to load vars for managed-node3
41175 1727204682.67215: Calling groups_plugins_play to load vars for managed-node3
41175 1727204682.68570: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41175 1727204682.70207: done with get_vars()
41175 1727204682.70234: done getting variables

TASK [Stat profile file] *******************************************************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9
Tuesday 24 September 2024 15:04:42 -0400 (0:00:00.051) 0:00:49.842 *****
41175 1727204682.70315: entering _queue_task() for managed-node3/stat
41175 1727204682.70590: worker is 1 (out of 1 available)
41175 1727204682.70605: exiting _queue_task() for managed-node3/stat
41175 1727204682.70622: done queuing things up, now waiting for results queue to drain
41175 1727204682.70624: waiting for pending results...
41175 1727204682.70810: running TaskExecutor() for managed-node3/TASK: Stat profile file
41175 1727204682.70908: in run() - task 12b410aa-8751-f070-39c4-00000000087c
41175 1727204682.70925: variable 'ansible_search_path' from source: unknown
41175 1727204682.70929: variable 'ansible_search_path' from source: unknown
41175 1727204682.70965: calling self._execute()
41175 1727204682.71050: variable 'ansible_host' from source: host vars for 'managed-node3'
41175 1727204682.71055: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
41175 1727204682.71072: variable 'omit' from source: magic vars
41175 1727204682.71402: variable 'ansible_distribution_major_version' from source: facts
41175 1727204682.71413: Evaluated conditional (ansible_distribution_major_version != '6'): True
41175 1727204682.71422: variable 'omit' from source: magic vars
41175 1727204682.71461: variable 'omit' from source: magic vars
41175 1727204682.71553: variable 'profile' from source: include params
41175 1727204682.71557: variable 'interface' from source: set_fact
41175 1727204682.71628: variable 'interface' from source: set_fact
41175 1727204682.71643: variable 'omit' from source: magic vars
41175 1727204682.71680: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
41175 1727204682.71712: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
41175 1727204682.71736: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
41175 1727204682.71753: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
41175 1727204682.71765: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
41175 1727204682.71793: variable 'inventory_hostname' from source: host vars for 'managed-node3'
41175 1727204682.71799: variable 'ansible_host' from source: host vars for 'managed-node3'
41175 1727204682.71804: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
41175 1727204682.71894: Set connection var ansible_shell_executable to /bin/sh
41175 1727204682.71898: Set connection var ansible_shell_type to sh
41175 1727204682.71904: Set connection var ansible_pipelining to False
41175 1727204682.71913: Set connection var ansible_timeout to 10
41175 1727204682.71922: Set connection var ansible_connection to ssh
41175 1727204682.71928: Set connection var ansible_module_compression to ZIP_DEFLATED
41175 1727204682.71953: variable 'ansible_shell_executable' from source: unknown
41175 1727204682.71956: variable 'ansible_connection' from source: unknown
41175 1727204682.71959: variable 'ansible_module_compression' from source: unknown
41175 1727204682.71961: variable 'ansible_shell_type' from source: unknown
41175 1727204682.71964: variable 'ansible_shell_executable' from source: unknown
41175 1727204682.71969: variable 'ansible_host' from source: host vars for 'managed-node3'
41175 1727204682.71974: variable 'ansible_pipelining' from source: unknown
41175 1727204682.71977: variable 'ansible_timeout' from source: unknown
41175 1727204682.71982: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
41175 1727204682.72162: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action)
41175 1727204682.72169: variable 'omit' from source: magic vars
41175 1727204682.72175: starting attempt loop
41175 1727204682.72178: running the handler
41175 1727204682.72193: _low_level_execute_command(): starting
41175 1727204682.72202: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
41175 1727204682.72753: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
41175 1727204682.72758: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<<
41175 1727204682.72762: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
41175 1727204682.72815: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
41175 1727204682.72822: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
41175 1727204682.72871: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
41175 1727204682.74632: stdout chunk (state=3): >>>/root <<<
41175 1727204682.74743: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
41175 1727204682.74805: stderr chunk (state=3): >>><<<
41175 1727204682.74812: stdout chunk (state=3): >>><<<
41175 1727204682.74835: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all'
host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204682.74848: _low_level_execute_command(): starting 41175 1727204682.74856: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204682.7483559-43221-216469388216201 `" && echo ansible-tmp-1727204682.7483559-43221-216469388216201="` echo /root/.ansible/tmp/ansible-tmp-1727204682.7483559-43221-216469388216201 `" ) && sleep 0' 41175 1727204682.75350: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204682.75353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204682.75356: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204682.75366: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.10.90 is address <<< 41175 1727204682.75369: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204682.75371: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204682.75418: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204682.75422: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204682.75470: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204682.77446: stdout chunk (state=3): >>>ansible-tmp-1727204682.7483559-43221-216469388216201=/root/.ansible/tmp/ansible-tmp-1727204682.7483559-43221-216469388216201 <<< 41175 1727204682.77566: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204682.77653: stderr chunk (state=3): >>><<< 41175 1727204682.77656: stdout chunk (state=3): >>><<< 41175 1727204682.77660: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204682.7483559-43221-216469388216201=/root/.ansible/tmp/ansible-tmp-1727204682.7483559-43221-216469388216201 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204682.77690: variable 'ansible_module_compression' from source: unknown 41175 1727204682.77737: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-41175qa2aqh90/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 41175 1727204682.77776: variable 'ansible_facts' from source: unknown 41175 1727204682.77836: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204682.7483559-43221-216469388216201/AnsiballZ_stat.py 41175 1727204682.77960: Sending initial data 41175 1727204682.77964: Sent initial data (153 bytes) 41175 1727204682.78452: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204682.78456: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 41175 1727204682.78459: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 41175 1727204682.78462: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 41175 1727204682.78465: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204682.78514: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204682.78518: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204682.78564: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204682.80150: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 41175 1727204682.80165: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41175 1727204682.80189: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41175 1727204682.80219: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41175qa2aqh90/tmpmr90u3mv /root/.ansible/tmp/ansible-tmp-1727204682.7483559-43221-216469388216201/AnsiballZ_stat.py <<< 41175 1727204682.80228: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204682.7483559-43221-216469388216201/AnsiballZ_stat.py" <<< 41175 1727204682.80256: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-41175qa2aqh90/tmpmr90u3mv" to remote "/root/.ansible/tmp/ansible-tmp-1727204682.7483559-43221-216469388216201/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204682.7483559-43221-216469388216201/AnsiballZ_stat.py" <<< 41175 1727204682.81009: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204682.81078: stderr chunk (state=3): >>><<< 41175 1727204682.81081: stdout chunk (state=3): >>><<< 41175 1727204682.81104: done transferring module to remote 41175 1727204682.81114: _low_level_execute_command(): starting 41175 1727204682.81121: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204682.7483559-43221-216469388216201/ /root/.ansible/tmp/ansible-tmp-1727204682.7483559-43221-216469388216201/AnsiballZ_stat.py && sleep 0' 41175 1727204682.81596: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204682.81599: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 41175 1727204682.81602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass <<< 41175 1727204682.81605: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 41175 1727204682.81612: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204682.81660: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204682.81664: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204682.81705: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204682.83516: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204682.83565: stderr chunk (state=3): >>><<< 41175 1727204682.83569: stdout chunk (state=3): >>><<< 41175 1727204682.83584: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204682.83588: _low_level_execute_command(): starting 41175 1727204682.83596: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204682.7483559-43221-216469388216201/AnsiballZ_stat.py && sleep 0' 41175 1727204682.84051: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204682.84054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204682.84057: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 41175 1727204682.84059: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 41175 1727204682.84062: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204682.84112: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204682.84119: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 <<< 41175 1727204682.84158: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204683.01271: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 41175 1727204683.02657: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 41175 1727204683.02721: stderr chunk (state=3): >>><<< 41175 1727204683.02725: stdout chunk (state=3): >>><<< 41175 1727204683.02740: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 41175 1727204683.02769: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204682.7483559-43221-216469388216201/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41175 1727204683.02781: _low_level_execute_command(): starting 41175 1727204683.02787: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204682.7483559-43221-216469388216201/ > /dev/null 2>&1 && sleep 0' 41175 1727204683.03283: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204683.03287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 41175 1727204683.03295: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 41175 1727204683.03298: stderr chunk (state=3): 
>>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204683.03300: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204683.03349: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204683.03360: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204683.03394: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204683.05320: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204683.05366: stderr chunk (state=3): >>><<< 41175 1727204683.05369: stdout chunk (state=3): >>><<< 41175 1727204683.05383: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204683.05394: handler run complete 41175 1727204683.05416: attempt loop complete, returning result 41175 1727204683.05422: _execute() done 41175 1727204683.05425: dumping result to json 41175 1727204683.05431: done dumping result, returning 41175 1727204683.05443: done running TaskExecutor() for managed-node3/TASK: Stat profile file [12b410aa-8751-f070-39c4-00000000087c] 41175 1727204683.05448: sending task result for task 12b410aa-8751-f070-39c4-00000000087c 41175 1727204683.05555: done sending task result for task 12b410aa-8751-f070-39c4-00000000087c 41175 1727204683.05558: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "stat": { "exists": false } } 41175 1727204683.05640: no more pending results, returning what we have 41175 1727204683.05645: results queue empty 41175 1727204683.05646: checking for any_errors_fatal 41175 1727204683.05655: done checking for any_errors_fatal 41175 1727204683.05656: checking for max_fail_percentage 41175 1727204683.05658: done checking for max_fail_percentage 41175 1727204683.05659: checking to see if all hosts have failed and the running result is not ok 41175 1727204683.05660: done checking to see if all hosts have failed 41175 1727204683.05661: getting the remaining hosts for this loop 41175 1727204683.05663: done getting the remaining hosts for this loop 41175 1727204683.05676: getting the next task for host managed-node3 41175 1727204683.05683: done getting next task for host managed-node3 41175 1727204683.05686: ^ task is: TASK: Set NM profile exist flag based on the profile files 41175 1727204683.05692: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41175 1727204683.05696: getting variables 41175 1727204683.05697: in VariableManager get_vars() 41175 1727204683.05728: Calling all_inventory to load vars for managed-node3 41175 1727204683.05731: Calling groups_inventory to load vars for managed-node3 41175 1727204683.05736: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204683.05748: Calling all_plugins_play to load vars for managed-node3 41175 1727204683.05752: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204683.05755: Calling groups_plugins_play to load vars for managed-node3 41175 1727204683.07166: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204683.08774: done with get_vars() 41175 1727204683.08799: done getting variables 41175 1727204683.08852: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: 
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 15:04:43 -0400 (0:00:00.385) 0:00:50.227 ***** 41175 1727204683.08878: entering _queue_task() for managed-node3/set_fact 41175 1727204683.09127: worker is 1 (out of 1 available) 41175 1727204683.09142: exiting _queue_task() for managed-node3/set_fact 41175 1727204683.09155: done queuing things up, now waiting for results queue to drain 41175 1727204683.09157: waiting for pending results... 41175 1727204683.09356: running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag based on the profile files 41175 1727204683.09450: in run() - task 12b410aa-8751-f070-39c4-00000000087d 41175 1727204683.09464: variable 'ansible_search_path' from source: unknown 41175 1727204683.09467: variable 'ansible_search_path' from source: unknown 41175 1727204683.09505: calling self._execute() 41175 1727204683.09593: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204683.09601: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204683.09613: variable 'omit' from source: magic vars 41175 1727204683.09944: variable 'ansible_distribution_major_version' from source: facts 41175 1727204683.09957: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204683.10060: variable 'profile_stat' from source: set_fact 41175 1727204683.10078: Evaluated conditional (profile_stat.stat.exists): False 41175 1727204683.10081: when evaluation is False, skipping this task 41175 1727204683.10084: _execute() done 41175 1727204683.10087: dumping result to json 41175 1727204683.10095: done dumping result, returning 41175 1727204683.10102: done running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag based on the profile files [12b410aa-8751-f070-39c4-00000000087d] 41175 1727204683.10109: sending task result for task 
12b410aa-8751-f070-39c4-00000000087d 41175 1727204683.10200: done sending task result for task 12b410aa-8751-f070-39c4-00000000087d 41175 1727204683.10203: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 41175 1727204683.10263: no more pending results, returning what we have 41175 1727204683.10267: results queue empty 41175 1727204683.10269: checking for any_errors_fatal 41175 1727204683.10278: done checking for any_errors_fatal 41175 1727204683.10279: checking for max_fail_percentage 41175 1727204683.10281: done checking for max_fail_percentage 41175 1727204683.10282: checking to see if all hosts have failed and the running result is not ok 41175 1727204683.10283: done checking to see if all hosts have failed 41175 1727204683.10284: getting the remaining hosts for this loop 41175 1727204683.10286: done getting the remaining hosts for this loop 41175 1727204683.10293: getting the next task for host managed-node3 41175 1727204683.10301: done getting next task for host managed-node3 41175 1727204683.10304: ^ task is: TASK: Get NM profile info 41175 1727204683.10308: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 41175 1727204683.10311: getting variables 41175 1727204683.10315: in VariableManager get_vars() 41175 1727204683.10343: Calling all_inventory to load vars for managed-node3 41175 1727204683.10346: Calling groups_inventory to load vars for managed-node3 41175 1727204683.10350: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204683.10361: Calling all_plugins_play to load vars for managed-node3 41175 1727204683.10364: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204683.10368: Calling groups_plugins_play to load vars for managed-node3 41175 1727204683.11614: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204683.13248: done with get_vars() 41175 1727204683.13272: done getting variables 41175 1727204683.13358: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 15:04:43 -0400 (0:00:00.045) 0:00:50.272 ***** 41175 1727204683.13382: entering _queue_task() for managed-node3/shell 41175 1727204683.13384: Creating lock for shell 41175 1727204683.13646: worker is 1 (out of 1 available) 41175 1727204683.13662: exiting _queue_task() for managed-node3/shell 41175 1727204683.13675: done queuing things up, now waiting for results queue to drain 41175 1727204683.13677: waiting for pending results... 
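The trace above walks through three tasks from `tasks/get_profile_stat.yml` (paths `:17` and `:25` are shown in the task banners): a `stat` of the initscripts profile file, a `set_fact` gated on `profile_stat.stat.exists` (evaluated `False` above, so it was skipped), and a `shell` task named "Get NM profile info". A hedged sketch of what those tasks plausibly look like, reconstructed only from the module args and conditionals visible in this trace — the real playbook file may differ, and the fact name and shell command below are hypothetical:

```yaml
# Reconstructed sketch of tasks/get_profile_stat.yml based on this trace;
# not copied from the actual file in fedora/linux_system_roles.
- name: Stat profile file
  stat:
    # The logged module_args show path /etc/sysconfig/network-scripts/ifcfg-ethtest0,
    # with 'profile' coming from include params, suggesting a template like this:
    path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"
    get_attributes: false   # the logged invocation disables these extras
    get_checksum: false
    get_mime: false
  register: profile_stat

- name: Set NM profile exist flag based on the profile files
  set_fact:
    profile_exists: true            # hypothetical fact name; not visible in the trace
  when: profile_stat.stat.exists    # trace: "Evaluated conditional ... False, skipping"

- name: Get NM profile info
  # The trace shows only the task name and that the shell action plugin loads;
  # this command is an illustrative guess, not taken from the log.
  shell: nmcli -f NAME connection show | grep "{{ profile }}"
  register: nm_profile_exists
```

This matches the observable behavior in the log: the `stat` task returns `{"changed": false, "stat": {"exists": false}}`, which makes the `set_fact` conditional `False` ("Conditional result was False"), after which the executor queues the shell task.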
41175 1727204683.13859: running TaskExecutor() for managed-node3/TASK: Get NM profile info 41175 1727204683.13955: in run() - task 12b410aa-8751-f070-39c4-00000000087e 41175 1727204683.13967: variable 'ansible_search_path' from source: unknown 41175 1727204683.13971: variable 'ansible_search_path' from source: unknown 41175 1727204683.14004: calling self._execute() 41175 1727204683.14085: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204683.14093: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204683.14105: variable 'omit' from source: magic vars 41175 1727204683.14424: variable 'ansible_distribution_major_version' from source: facts 41175 1727204683.14434: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204683.14440: variable 'omit' from source: magic vars 41175 1727204683.14480: variable 'omit' from source: magic vars 41175 1727204683.14563: variable 'profile' from source: include params 41175 1727204683.14568: variable 'interface' from source: set_fact 41175 1727204683.14632: variable 'interface' from source: set_fact 41175 1727204683.14648: variable 'omit' from source: magic vars 41175 1727204683.14688: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41175 1727204683.14721: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41175 1727204683.14738: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41175 1727204683.14756: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204683.14768: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204683.14802: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41175 
1727204683.14805: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204683.14808: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204683.14898: Set connection var ansible_shell_executable to /bin/sh 41175 1727204683.14902: Set connection var ansible_shell_type to sh 41175 1727204683.14907: Set connection var ansible_pipelining to False 41175 1727204683.14919: Set connection var ansible_timeout to 10 41175 1727204683.14923: Set connection var ansible_connection to ssh 41175 1727204683.14930: Set connection var ansible_module_compression to ZIP_DEFLATED 41175 1727204683.14950: variable 'ansible_shell_executable' from source: unknown 41175 1727204683.14953: variable 'ansible_connection' from source: unknown 41175 1727204683.14956: variable 'ansible_module_compression' from source: unknown 41175 1727204683.14960: variable 'ansible_shell_type' from source: unknown 41175 1727204683.14962: variable 'ansible_shell_executable' from source: unknown 41175 1727204683.14967: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204683.14972: variable 'ansible_pipelining' from source: unknown 41175 1727204683.14976: variable 'ansible_timeout' from source: unknown 41175 1727204683.14981: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204683.15097: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41175 1727204683.15112: variable 'omit' from source: magic vars 41175 1727204683.15116: starting attempt loop 41175 1727204683.15122: running the handler 41175 1727204683.15133: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41175 1727204683.15153: _low_level_execute_command(): starting 41175 1727204683.15160: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41175 1727204683.15720: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204683.15724: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204683.15727: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204683.15729: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204683.15776: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204683.15796: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204683.15834: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204683.17529: stdout chunk (state=3): >>>/root <<< 41175 1727204683.17639: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204683.17694: 
stderr chunk (state=3): >>><<< 41175 1727204683.17698: stdout chunk (state=3): >>><<< 41175 1727204683.17719: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204683.17733: _low_level_execute_command(): starting 41175 1727204683.17737: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204683.1771655-43230-254726247232588 `" && echo ansible-tmp-1727204683.1771655-43230-254726247232588="` echo /root/.ansible/tmp/ansible-tmp-1727204683.1771655-43230-254726247232588 `" ) && sleep 0' 41175 1727204683.18184: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204683.18204: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204683.18207: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204683.18210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204683.18251: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204683.18259: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204683.18301: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204683.20267: stdout chunk (state=3): >>>ansible-tmp-1727204683.1771655-43230-254726247232588=/root/.ansible/tmp/ansible-tmp-1727204683.1771655-43230-254726247232588 <<< 41175 1727204683.20384: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204683.20432: stderr chunk (state=3): >>><<< 41175 1727204683.20435: stdout chunk (state=3): >>><<< 41175 1727204683.20448: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204683.1771655-43230-254726247232588=/root/.ansible/tmp/ansible-tmp-1727204683.1771655-43230-254726247232588 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204683.20475: variable 'ansible_module_compression' from source: unknown 41175 1727204683.20521: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-41175qa2aqh90/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 41175 1727204683.20554: variable 'ansible_facts' from source: unknown 41175 1727204683.20613: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204683.1771655-43230-254726247232588/AnsiballZ_command.py 41175 1727204683.20722: Sending initial data 41175 1727204683.20726: Sent initial data (156 bytes) 41175 1727204683.21178: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204683.21181: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 41175 1727204683.21184: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 41175 1727204683.21186: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 41175 1727204683.21188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204683.21244: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204683.21251: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204683.21282: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204683.22879: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41175 1727204683.22914: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41175 1727204683.22950: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41175qa2aqh90/tmp4tze0yfh /root/.ansible/tmp/ansible-tmp-1727204683.1771655-43230-254726247232588/AnsiballZ_command.py <<< 41175 1727204683.22953: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204683.1771655-43230-254726247232588/AnsiballZ_command.py" <<< 41175 1727204683.22984: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-41175qa2aqh90/tmp4tze0yfh" to remote "/root/.ansible/tmp/ansible-tmp-1727204683.1771655-43230-254726247232588/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204683.1771655-43230-254726247232588/AnsiballZ_command.py" <<< 41175 1727204683.23759: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204683.23842: stderr chunk (state=3): >>><<< 41175 1727204683.23846: stdout chunk (state=3): >>><<< 41175 1727204683.23869: done transferring module to remote 41175 1727204683.23880: _low_level_execute_command(): starting 41175 1727204683.23888: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204683.1771655-43230-254726247232588/ /root/.ansible/tmp/ansible-tmp-1727204683.1771655-43230-254726247232588/AnsiballZ_command.py && sleep 0' 41175 1727204683.24388: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204683.24400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 41175 1727204683.24403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204683.24409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204683.24460: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204683.24464: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204683.24507: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204683.26600: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204683.26606: stdout chunk (state=3): >>><<< 41175 1727204683.26609: stderr chunk (state=3): >>><<< 41175 1727204683.26612: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204683.26615: _low_level_execute_command(): starting 41175 1727204683.26621: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204683.1771655-43230-254726247232588/AnsiballZ_command.py && sleep 0' 41175 1727204683.27247: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 1727204683.27293: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204683.27310: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204683.27410: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204683.27441: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204683.27513: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 41175 1727204683.46544: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "start": "2024-09-24 15:04:43.446535", "end": "2024-09-24 15:04:43.464191", "delta": "0:00:00.017656", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 41175 1727204683.48145: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.10.90 closed. <<< 41175 1727204683.48209: stderr chunk (state=3): >>><<< 41175 1727204683.48213: stdout chunk (state=3): >>><<< 41175 1727204683.48236: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "start": "2024-09-24 15:04:43.446535", "end": "2024-09-24 15:04:43.464191", "delta": "0:00:00.017656", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.10.90 closed. 41175 1727204683.48269: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204683.1771655-43230-254726247232588/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41175 1727204683.48282: _low_level_execute_command(): starting 41175 1727204683.48294: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204683.1771655-43230-254726247232588/ > /dev/null 2>&1 && sleep 0' 41175 1727204683.48776: stderr chunk (state=2): >>>OpenSSH_9.3p1, 
OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204683.48780: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 41175 1727204683.48786: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 41175 1727204683.48792: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 41175 1727204683.48795: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204683.48849: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204683.48853: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204683.48891: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204683.50819: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204683.50871: stderr chunk (state=3): >>><<< 41175 1727204683.50874: stdout chunk (state=3): >>><<< 41175 1727204683.50892: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204683.50900: handler run complete 41175 1727204683.50927: Evaluated conditional (False): False 41175 1727204683.50938: attempt loop complete, returning result 41175 1727204683.50941: _execute() done 41175 1727204683.50946: dumping result to json 41175 1727204683.50951: done dumping result, returning 41175 1727204683.50959: done running TaskExecutor() for managed-node3/TASK: Get NM profile info [12b410aa-8751-f070-39c4-00000000087e] 41175 1727204683.50969: sending task result for task 12b410aa-8751-f070-39c4-00000000087e 41175 1727204683.51078: done sending task result for task 12b410aa-8751-f070-39c4-00000000087e 41175 1727204683.51081: WORKER PROCESS EXITING fatal: [managed-node3]: FAILED! 
=> { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "delta": "0:00:00.017656", "end": "2024-09-24 15:04:43.464191", "rc": 1, "start": "2024-09-24 15:04:43.446535" } MSG: non-zero return code ...ignoring 41175 1727204683.51172: no more pending results, returning what we have 41175 1727204683.51177: results queue empty 41175 1727204683.51178: checking for any_errors_fatal 41175 1727204683.51185: done checking for any_errors_fatal 41175 1727204683.51186: checking for max_fail_percentage 41175 1727204683.51188: done checking for max_fail_percentage 41175 1727204683.51191: checking to see if all hosts have failed and the running result is not ok 41175 1727204683.51193: done checking to see if all hosts have failed 41175 1727204683.51194: getting the remaining hosts for this loop 41175 1727204683.51196: done getting the remaining hosts for this loop 41175 1727204683.51200: getting the next task for host managed-node3 41175 1727204683.51208: done getting next task for host managed-node3 41175 1727204683.51211: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 41175 1727204683.51215: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 41175 1727204683.51219: getting variables 41175 1727204683.51220: in VariableManager get_vars() 41175 1727204683.51250: Calling all_inventory to load vars for managed-node3 41175 1727204683.51253: Calling groups_inventory to load vars for managed-node3 41175 1727204683.51257: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204683.51269: Calling all_plugins_play to load vars for managed-node3 41175 1727204683.51272: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204683.51276: Calling groups_plugins_play to load vars for managed-node3 41175 1727204683.52704: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204683.54319: done with get_vars() 41175 1727204683.54347: done getting variables 41175 1727204683.54400: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 15:04:43 -0400 (0:00:00.410) 0:00:50.683 ***** 41175 1727204683.54427: entering _queue_task() for managed-node3/set_fact 41175 1727204683.54693: worker is 1 (out of 1 available) 41175 1727204683.54710: exiting _queue_task() for managed-node3/set_fact 41175 1727204683.54721: done queuing things up, now waiting for results queue to drain 41175 1727204683.54724: waiting for pending results... 
41175 1727204683.54923: running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 41175 1727204683.55025: in run() - task 12b410aa-8751-f070-39c4-00000000087f 41175 1727204683.55040: variable 'ansible_search_path' from source: unknown 41175 1727204683.55044: variable 'ansible_search_path' from source: unknown 41175 1727204683.55084: calling self._execute() 41175 1727204683.55162: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204683.55168: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204683.55182: variable 'omit' from source: magic vars 41175 1727204683.55505: variable 'ansible_distribution_major_version' from source: facts 41175 1727204683.55519: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204683.55631: variable 'nm_profile_exists' from source: set_fact 41175 1727204683.55645: Evaluated conditional (nm_profile_exists.rc == 0): False 41175 1727204683.55649: when evaluation is False, skipping this task 41175 1727204683.55652: _execute() done 41175 1727204683.55657: dumping result to json 41175 1727204683.55660: done dumping result, returning 41175 1727204683.55669: done running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [12b410aa-8751-f070-39c4-00000000087f] 41175 1727204683.55675: sending task result for task 12b410aa-8751-f070-39c4-00000000087f 41175 1727204683.55769: done sending task result for task 12b410aa-8751-f070-39c4-00000000087f 41175 1727204683.55772: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "nm_profile_exists.rc == 0", "skip_reason": "Conditional result was False" } 41175 1727204683.55832: no more pending results, returning what we have 41175 1727204683.55837: results queue empty 41175 1727204683.55838: checking for any_errors_fatal 41175 
1727204683.55851: done checking for any_errors_fatal 41175 1727204683.55852: checking for max_fail_percentage 41175 1727204683.55853: done checking for max_fail_percentage 41175 1727204683.55854: checking to see if all hosts have failed and the running result is not ok 41175 1727204683.55855: done checking to see if all hosts have failed 41175 1727204683.55856: getting the remaining hosts for this loop 41175 1727204683.55858: done getting the remaining hosts for this loop 41175 1727204683.55863: getting the next task for host managed-node3 41175 1727204683.55873: done getting next task for host managed-node3 41175 1727204683.55876: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 41175 1727204683.55881: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204683.55887: getting variables 41175 1727204683.55888: in VariableManager get_vars() 41175 1727204683.55916: Calling all_inventory to load vars for managed-node3 41175 1727204683.55920: Calling groups_inventory to load vars for managed-node3 41175 1727204683.55923: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204683.55934: Calling all_plugins_play to load vars for managed-node3 41175 1727204683.55938: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204683.55941: Calling groups_plugins_play to load vars for managed-node3 41175 1727204683.57282: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204683.58893: done with get_vars() 41175 1727204683.58918: done getting variables 41175 1727204683.58969: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 41175 1727204683.59070: variable 'profile' from source: include params 41175 1727204683.59073: variable 'interface' from source: set_fact 41175 1727204683.59128: variable 'interface' from source: set_fact TASK [Get the ansible_managed comment in ifcfg-ethtest0] *********************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 15:04:43 -0400 (0:00:00.047) 0:00:50.730 ***** 41175 1727204683.59157: entering _queue_task() for managed-node3/command 41175 1727204683.59417: worker is 1 (out of 1 available) 41175 1727204683.59434: exiting _queue_task() for managed-node3/command 41175 1727204683.59446: done queuing things up, now waiting for results queue to drain 41175 1727204683.59448: waiting for pending results... 
41175 1727204683.59643: running TaskExecutor() for managed-node3/TASK: Get the ansible_managed comment in ifcfg-ethtest0 41175 1727204683.59744: in run() - task 12b410aa-8751-f070-39c4-000000000881 41175 1727204683.59756: variable 'ansible_search_path' from source: unknown 41175 1727204683.59760: variable 'ansible_search_path' from source: unknown 41175 1727204683.59800: calling self._execute() 41175 1727204683.59880: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204683.59888: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204683.59901: variable 'omit' from source: magic vars 41175 1727204683.60215: variable 'ansible_distribution_major_version' from source: facts 41175 1727204683.60232: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204683.60334: variable 'profile_stat' from source: set_fact 41175 1727204683.60351: Evaluated conditional (profile_stat.stat.exists): False 41175 1727204683.60356: when evaluation is False, skipping this task 41175 1727204683.60361: _execute() done 41175 1727204683.60365: dumping result to json 41175 1727204683.60370: done dumping result, returning 41175 1727204683.60378: done running TaskExecutor() for managed-node3/TASK: Get the ansible_managed comment in ifcfg-ethtest0 [12b410aa-8751-f070-39c4-000000000881] 41175 1727204683.60384: sending task result for task 12b410aa-8751-f070-39c4-000000000881 41175 1727204683.60477: done sending task result for task 12b410aa-8751-f070-39c4-000000000881 41175 1727204683.60480: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 41175 1727204683.60542: no more pending results, returning what we have 41175 1727204683.60547: results queue empty 41175 1727204683.60548: checking for any_errors_fatal 41175 1727204683.60555: done checking for any_errors_fatal 41175 1727204683.60556: 
checking for max_fail_percentage 41175 1727204683.60558: done checking for max_fail_percentage 41175 1727204683.60559: checking to see if all hosts have failed and the running result is not ok 41175 1727204683.60560: done checking to see if all hosts have failed 41175 1727204683.60561: getting the remaining hosts for this loop 41175 1727204683.60563: done getting the remaining hosts for this loop 41175 1727204683.60568: getting the next task for host managed-node3 41175 1727204683.60575: done getting next task for host managed-node3 41175 1727204683.60578: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 41175 1727204683.60582: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204683.60586: getting variables 41175 1727204683.60588: in VariableManager get_vars() 41175 1727204683.60619: Calling all_inventory to load vars for managed-node3 41175 1727204683.60623: Calling groups_inventory to load vars for managed-node3 41175 1727204683.60627: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204683.60638: Calling all_plugins_play to load vars for managed-node3 41175 1727204683.60641: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204683.60645: Calling groups_plugins_play to load vars for managed-node3 41175 1727204683.61891: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204683.63519: done with get_vars() 41175 1727204683.63544: done getting variables 41175 1727204683.63598: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 41175 1727204683.63687: variable 'profile' from source: include params 41175 1727204683.63692: variable 'interface' from source: set_fact 41175 1727204683.63741: variable 'interface' from source: set_fact TASK [Verify the ansible_managed comment in ifcfg-ethtest0] ******************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 15:04:43 -0400 (0:00:00.046) 0:00:50.776 ***** 41175 1727204683.63771: entering _queue_task() for managed-node3/set_fact 41175 1727204683.64029: worker is 1 (out of 1 available) 41175 1727204683.64044: exiting _queue_task() for managed-node3/set_fact 41175 1727204683.64056: done queuing things up, now waiting for results queue to drain 41175 1727204683.64058: waiting for pending results... 
41175 1727204683.64242: running TaskExecutor() for managed-node3/TASK: Verify the ansible_managed comment in ifcfg-ethtest0 41175 1727204683.64331: in run() - task 12b410aa-8751-f070-39c4-000000000882 41175 1727204683.64346: variable 'ansible_search_path' from source: unknown 41175 1727204683.64351: variable 'ansible_search_path' from source: unknown 41175 1727204683.64381: calling self._execute() 41175 1727204683.64476: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204683.64483: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204683.64495: variable 'omit' from source: magic vars 41175 1727204683.64812: variable 'ansible_distribution_major_version' from source: facts 41175 1727204683.64828: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204683.64932: variable 'profile_stat' from source: set_fact 41175 1727204683.64947: Evaluated conditional (profile_stat.stat.exists): False 41175 1727204683.64952: when evaluation is False, skipping this task 41175 1727204683.64956: _execute() done 41175 1727204683.64958: dumping result to json 41175 1727204683.64961: done dumping result, returning 41175 1727204683.64971: done running TaskExecutor() for managed-node3/TASK: Verify the ansible_managed comment in ifcfg-ethtest0 [12b410aa-8751-f070-39c4-000000000882] 41175 1727204683.64975: sending task result for task 12b410aa-8751-f070-39c4-000000000882 41175 1727204683.65067: done sending task result for task 12b410aa-8751-f070-39c4-000000000882 41175 1727204683.65072: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 41175 1727204683.65125: no more pending results, returning what we have 41175 1727204683.65130: results queue empty 41175 1727204683.65131: checking for any_errors_fatal 41175 1727204683.65139: done checking for any_errors_fatal 41175 1727204683.65139: 
checking for max_fail_percentage 41175 1727204683.65141: done checking for max_fail_percentage 41175 1727204683.65143: checking to see if all hosts have failed and the running result is not ok 41175 1727204683.65144: done checking to see if all hosts have failed 41175 1727204683.65144: getting the remaining hosts for this loop 41175 1727204683.65146: done getting the remaining hosts for this loop 41175 1727204683.65152: getting the next task for host managed-node3 41175 1727204683.65160: done getting next task for host managed-node3 41175 1727204683.65162: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 41175 1727204683.65167: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204683.65170: getting variables 41175 1727204683.65172: in VariableManager get_vars() 41175 1727204683.65199: Calling all_inventory to load vars for managed-node3 41175 1727204683.65203: Calling groups_inventory to load vars for managed-node3 41175 1727204683.65206: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204683.65218: Calling all_plugins_play to load vars for managed-node3 41175 1727204683.65221: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204683.65225: Calling groups_plugins_play to load vars for managed-node3 41175 1727204683.70150: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204683.71744: done with get_vars() 41175 1727204683.71771: done getting variables 41175 1727204683.71816: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 41175 1727204683.71899: variable 'profile' from source: include params 41175 1727204683.71902: variable 'interface' from source: set_fact 41175 1727204683.71953: variable 'interface' from source: set_fact TASK [Get the fingerprint comment in ifcfg-ethtest0] *************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 15:04:43 -0400 (0:00:00.082) 0:00:50.858 ***** 41175 1727204683.71976: entering _queue_task() for managed-node3/command 41175 1727204683.72301: worker is 1 (out of 1 available) 41175 1727204683.72316: exiting _queue_task() for managed-node3/command 41175 1727204683.72329: done queuing things up, now waiting for results queue to drain 41175 1727204683.72331: waiting for pending results... 
41175 1727204683.72582: running TaskExecutor() for managed-node3/TASK: Get the fingerprint comment in ifcfg-ethtest0 41175 1727204683.72688: in run() - task 12b410aa-8751-f070-39c4-000000000883 41175 1727204683.72704: variable 'ansible_search_path' from source: unknown 41175 1727204683.72707: variable 'ansible_search_path' from source: unknown 41175 1727204683.72743: calling self._execute() 41175 1727204683.72836: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204683.72844: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204683.72854: variable 'omit' from source: magic vars 41175 1727204683.73195: variable 'ansible_distribution_major_version' from source: facts 41175 1727204683.73208: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204683.73316: variable 'profile_stat' from source: set_fact 41175 1727204683.73335: Evaluated conditional (profile_stat.stat.exists): False 41175 1727204683.73339: when evaluation is False, skipping this task 41175 1727204683.73342: _execute() done 41175 1727204683.73347: dumping result to json 41175 1727204683.73350: done dumping result, returning 41175 1727204683.73358: done running TaskExecutor() for managed-node3/TASK: Get the fingerprint comment in ifcfg-ethtest0 [12b410aa-8751-f070-39c4-000000000883] 41175 1727204683.73365: sending task result for task 12b410aa-8751-f070-39c4-000000000883 41175 1727204683.73461: done sending task result for task 12b410aa-8751-f070-39c4-000000000883 41175 1727204683.73464: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 41175 1727204683.73524: no more pending results, returning what we have 41175 1727204683.73529: results queue empty 41175 1727204683.73530: checking for any_errors_fatal 41175 1727204683.73540: done checking for any_errors_fatal 41175 1727204683.73541: checking for 
max_fail_percentage 41175 1727204683.73543: done checking for max_fail_percentage 41175 1727204683.73544: checking to see if all hosts have failed and the running result is not ok 41175 1727204683.73545: done checking to see if all hosts have failed 41175 1727204683.73546: getting the remaining hosts for this loop 41175 1727204683.73548: done getting the remaining hosts for this loop 41175 1727204683.73554: getting the next task for host managed-node3 41175 1727204683.73563: done getting next task for host managed-node3 41175 1727204683.73565: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 41175 1727204683.73570: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204683.73577: getting variables 41175 1727204683.73579: in VariableManager get_vars() 41175 1727204683.73617: Calling all_inventory to load vars for managed-node3 41175 1727204683.73621: Calling groups_inventory to load vars for managed-node3 41175 1727204683.73625: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204683.73637: Calling all_plugins_play to load vars for managed-node3 41175 1727204683.73641: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204683.73644: Calling groups_plugins_play to load vars for managed-node3 41175 1727204683.75771: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204683.78081: done with get_vars() 41175 1727204683.78109: done getting variables 41175 1727204683.78164: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 41175 1727204683.78269: variable 'profile' from source: include params 41175 1727204683.78273: variable 'interface' from source: set_fact 41175 1727204683.78328: variable 'interface' from source: set_fact TASK [Verify the fingerprint comment in ifcfg-ethtest0] ************************ task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 15:04:43 -0400 (0:00:00.063) 0:00:50.922 ***** 41175 1727204683.78355: entering _queue_task() for managed-node3/set_fact 41175 1727204683.78637: worker is 1 (out of 1 available) 41175 1727204683.78653: exiting _queue_task() for managed-node3/set_fact 41175 1727204683.78665: done queuing things up, now waiting for results queue to drain 41175 1727204683.78667: waiting for pending results... 
41175 1727204683.79029: running TaskExecutor() for managed-node3/TASK: Verify the fingerprint comment in ifcfg-ethtest0 41175 1727204683.79077: in run() - task 12b410aa-8751-f070-39c4-000000000884 41175 1727204683.79105: variable 'ansible_search_path' from source: unknown 41175 1727204683.79116: variable 'ansible_search_path' from source: unknown 41175 1727204683.79164: calling self._execute() 41175 1727204683.79282: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204683.79299: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204683.79318: variable 'omit' from source: magic vars 41175 1727204683.79762: variable 'ansible_distribution_major_version' from source: facts 41175 1727204683.79782: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204683.79940: variable 'profile_stat' from source: set_fact 41175 1727204683.79964: Evaluated conditional (profile_stat.stat.exists): False 41175 1727204683.79973: when evaluation is False, skipping this task 41175 1727204683.79995: _execute() done 41175 1727204683.79997: dumping result to json 41175 1727204683.80000: done dumping result, returning 41175 1727204683.80095: done running TaskExecutor() for managed-node3/TASK: Verify the fingerprint comment in ifcfg-ethtest0 [12b410aa-8751-f070-39c4-000000000884] 41175 1727204683.80099: sending task result for task 12b410aa-8751-f070-39c4-000000000884 41175 1727204683.80171: done sending task result for task 12b410aa-8751-f070-39c4-000000000884 41175 1727204683.80175: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 41175 1727204683.80236: no more pending results, returning what we have 41175 1727204683.80241: results queue empty 41175 1727204683.80242: checking for any_errors_fatal 41175 1727204683.80250: done checking for any_errors_fatal 41175 1727204683.80251: 
checking for max_fail_percentage 41175 1727204683.80252: done checking for max_fail_percentage 41175 1727204683.80253: checking to see if all hosts have failed and the running result is not ok 41175 1727204683.80254: done checking to see if all hosts have failed 41175 1727204683.80255: getting the remaining hosts for this loop 41175 1727204683.80257: done getting the remaining hosts for this loop 41175 1727204683.80262: getting the next task for host managed-node3 41175 1727204683.80271: done getting next task for host managed-node3 41175 1727204683.80274: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}' 41175 1727204683.80277: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204683.80282: getting variables 41175 1727204683.80284: in VariableManager get_vars() 41175 1727204683.80320: Calling all_inventory to load vars for managed-node3 41175 1727204683.80324: Calling groups_inventory to load vars for managed-node3 41175 1727204683.80335: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204683.80348: Calling all_plugins_play to load vars for managed-node3 41175 1727204683.80352: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204683.80355: Calling groups_plugins_play to load vars for managed-node3 41175 1727204683.82935: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204683.86000: done with get_vars() 41175 1727204683.86044: done getting variables 41175 1727204683.86118: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 41175 1727204683.86262: variable 'profile' from source: include params 41175 1727204683.86266: variable 'interface' from source: set_fact 41175 1727204683.86341: variable 'interface' from source: set_fact TASK [Assert that the profile is absent - 'ethtest0'] ************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5 Tuesday 24 September 2024 15:04:43 -0400 (0:00:00.080) 0:00:51.002 ***** 41175 1727204683.86378: entering _queue_task() for managed-node3/assert 41175 1727204683.86746: worker is 1 (out of 1 available) 41175 1727204683.86761: exiting _queue_task() for managed-node3/assert 41175 1727204683.86773: done queuing things up, now waiting for results queue to drain 41175 1727204683.86775: waiting for pending results... 
41175 1727204683.87211: running TaskExecutor() for managed-node3/TASK: Assert that the profile is absent - 'ethtest0' 41175 1727204683.87216: in run() - task 12b410aa-8751-f070-39c4-00000000086d 41175 1727204683.87235: variable 'ansible_search_path' from source: unknown 41175 1727204683.87243: variable 'ansible_search_path' from source: unknown 41175 1727204683.87286: calling self._execute() 41175 1727204683.87409: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204683.87425: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204683.87446: variable 'omit' from source: magic vars 41175 1727204683.87893: variable 'ansible_distribution_major_version' from source: facts 41175 1727204683.87913: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204683.87926: variable 'omit' from source: magic vars 41175 1727204683.87984: variable 'omit' from source: magic vars 41175 1727204683.88119: variable 'profile' from source: include params 41175 1727204683.88196: variable 'interface' from source: set_fact 41175 1727204683.88216: variable 'interface' from source: set_fact 41175 1727204683.88243: variable 'omit' from source: magic vars 41175 1727204683.88296: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41175 1727204683.88346: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41175 1727204683.88374: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41175 1727204683.88403: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204683.88428: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204683.88471: variable 'inventory_hostname' from source: host vars for 
'managed-node3' 41175 1727204683.88480: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204683.88492: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204683.88792: Set connection var ansible_shell_executable to /bin/sh 41175 1727204683.88797: Set connection var ansible_shell_type to sh 41175 1727204683.88799: Set connection var ansible_pipelining to False 41175 1727204683.88801: Set connection var ansible_timeout to 10 41175 1727204683.88803: Set connection var ansible_connection to ssh 41175 1727204683.88805: Set connection var ansible_module_compression to ZIP_DEFLATED 41175 1727204683.88807: variable 'ansible_shell_executable' from source: unknown 41175 1727204683.88809: variable 'ansible_connection' from source: unknown 41175 1727204683.88811: variable 'ansible_module_compression' from source: unknown 41175 1727204683.88813: variable 'ansible_shell_type' from source: unknown 41175 1727204683.88815: variable 'ansible_shell_executable' from source: unknown 41175 1727204683.88817: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204683.88819: variable 'ansible_pipelining' from source: unknown 41175 1727204683.88821: variable 'ansible_timeout' from source: unknown 41175 1727204683.88823: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204683.88907: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41175 1727204683.88927: variable 'omit' from source: magic vars 41175 1727204683.88943: starting attempt loop 41175 1727204683.88949: running the handler 41175 1727204683.89102: variable 'lsr_net_profile_exists' from source: set_fact 41175 1727204683.89115: Evaluated conditional (not 
lsr_net_profile_exists): True 41175 1727204683.89127: handler run complete 41175 1727204683.89158: attempt loop complete, returning result 41175 1727204683.89167: _execute() done 41175 1727204683.89175: dumping result to json 41175 1727204683.89184: done dumping result, returning 41175 1727204683.89199: done running TaskExecutor() for managed-node3/TASK: Assert that the profile is absent - 'ethtest0' [12b410aa-8751-f070-39c4-00000000086d] 41175 1727204683.89211: sending task result for task 12b410aa-8751-f070-39c4-00000000086d 41175 1727204683.89497: done sending task result for task 12b410aa-8751-f070-39c4-00000000086d 41175 1727204683.89500: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false } MSG: All assertions passed 41175 1727204683.89557: no more pending results, returning what we have 41175 1727204683.89563: results queue empty 41175 1727204683.89564: checking for any_errors_fatal 41175 1727204683.89573: done checking for any_errors_fatal 41175 1727204683.89574: checking for max_fail_percentage 41175 1727204683.89575: done checking for max_fail_percentage 41175 1727204683.89577: checking to see if all hosts have failed and the running result is not ok 41175 1727204683.89578: done checking to see if all hosts have failed 41175 1727204683.89579: getting the remaining hosts for this loop 41175 1727204683.89581: done getting the remaining hosts for this loop 41175 1727204683.89587: getting the next task for host managed-node3 41175 1727204683.89597: done getting next task for host managed-node3 41175 1727204683.89602: ^ task is: TASK: Include the task 'assert_device_absent.yml' 41175 1727204683.89604: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204683.89610: getting variables 41175 1727204683.89612: in VariableManager get_vars() 41175 1727204683.89648: Calling all_inventory to load vars for managed-node3 41175 1727204683.89652: Calling groups_inventory to load vars for managed-node3 41175 1727204683.89657: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204683.89672: Calling all_plugins_play to load vars for managed-node3 41175 1727204683.89677: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204683.89681: Calling groups_plugins_play to load vars for managed-node3 41175 1727204683.92291: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204683.95086: done with get_vars() 41175 1727204683.95121: done getting variables TASK [Include the task 'assert_device_absent.yml'] ***************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:156 Tuesday 24 September 2024 15:04:43 -0400 (0:00:00.088) 0:00:51.090 ***** 41175 1727204683.95201: entering _queue_task() for managed-node3/include_tasks 41175 1727204683.95470: worker is 1 (out of 1 available) 41175 1727204683.95487: exiting _queue_task() for managed-node3/include_tasks 41175 1727204683.95499: done queuing things up, now waiting for results queue to drain 41175 1727204683.95501: waiting for pending results... 
41175 1727204683.95703: running TaskExecutor() for managed-node3/TASK: Include the task 'assert_device_absent.yml' 41175 1727204683.95786: in run() - task 12b410aa-8751-f070-39c4-0000000000f0 41175 1727204683.95800: variable 'ansible_search_path' from source: unknown 41175 1727204683.95841: calling self._execute() 41175 1727204683.95925: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204683.95933: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204683.95946: variable 'omit' from source: magic vars 41175 1727204683.96263: variable 'ansible_distribution_major_version' from source: facts 41175 1727204683.96276: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204683.96283: _execute() done 41175 1727204683.96295: dumping result to json 41175 1727204683.96298: done dumping result, returning 41175 1727204683.96307: done running TaskExecutor() for managed-node3/TASK: Include the task 'assert_device_absent.yml' [12b410aa-8751-f070-39c4-0000000000f0] 41175 1727204683.96314: sending task result for task 12b410aa-8751-f070-39c4-0000000000f0 41175 1727204683.96412: done sending task result for task 12b410aa-8751-f070-39c4-0000000000f0 41175 1727204683.96415: WORKER PROCESS EXITING 41175 1727204683.96448: no more pending results, returning what we have 41175 1727204683.96453: in VariableManager get_vars() 41175 1727204683.96492: Calling all_inventory to load vars for managed-node3 41175 1727204683.96496: Calling groups_inventory to load vars for managed-node3 41175 1727204683.96500: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204683.96515: Calling all_plugins_play to load vars for managed-node3 41175 1727204683.96522: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204683.96526: Calling groups_plugins_play to load vars for managed-node3 41175 1727204683.98368: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204684.00171: done with get_vars() 41175 1727204684.00203: variable 'ansible_search_path' from source: unknown 41175 1727204684.00218: we have included files to process 41175 1727204684.00220: generating all_blocks data 41175 1727204684.00222: done generating all_blocks data 41175 1727204684.00229: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 41175 1727204684.00230: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 41175 1727204684.00234: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 41175 1727204684.00422: in VariableManager get_vars() 41175 1727204684.00441: done with get_vars() 41175 1727204684.00576: done processing included file 41175 1727204684.00579: iterating over new_blocks loaded from include file 41175 1727204684.00581: in VariableManager get_vars() 41175 1727204684.00597: done with get_vars() 41175 1727204684.00599: filtering new block on tags 41175 1727204684.00622: done filtering new block on tags 41175 1727204684.00625: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed-node3 41175 1727204684.00631: extending task lists for all hosts with included blocks 41175 1727204684.00845: done extending task lists 41175 1727204684.00847: done processing included files 41175 1727204684.00848: results queue empty 41175 1727204684.00849: checking for any_errors_fatal 41175 1727204684.00853: done checking for any_errors_fatal 41175 1727204684.00854: checking for max_fail_percentage 41175 1727204684.00855: done 
checking for max_fail_percentage 41175 1727204684.00856: checking to see if all hosts have failed and the running result is not ok 41175 1727204684.00857: done checking to see if all hosts have failed 41175 1727204684.00858: getting the remaining hosts for this loop 41175 1727204684.00859: done getting the remaining hosts for this loop 41175 1727204684.00862: getting the next task for host managed-node3 41175 1727204684.00867: done getting next task for host managed-node3 41175 1727204684.00869: ^ task is: TASK: Include the task 'get_interface_stat.yml' 41175 1727204684.00872: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204684.00875: getting variables 41175 1727204684.00876: in VariableManager get_vars() 41175 1727204684.00886: Calling all_inventory to load vars for managed-node3 41175 1727204684.00891: Calling groups_inventory to load vars for managed-node3 41175 1727204684.00894: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204684.00900: Calling all_plugins_play to load vars for managed-node3 41175 1727204684.00904: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204684.00908: Calling groups_plugins_play to load vars for managed-node3 41175 1727204684.02416: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204684.04438: done with get_vars() 41175 1727204684.04482: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Tuesday 24 September 2024 15:04:44 -0400 (0:00:00.093) 0:00:51.184 ***** 41175 1727204684.04576: entering _queue_task() for managed-node3/include_tasks 41175 1727204684.04959: worker is 1 (out of 1 available) 41175 1727204684.04972: exiting _queue_task() for managed-node3/include_tasks 41175 1727204684.04985: done queuing things up, now waiting for results queue to drain 41175 1727204684.04987: waiting for pending results... 
41175 1727204684.05333: running TaskExecutor() for managed-node3/TASK: Include the task 'get_interface_stat.yml' 41175 1727204684.05534: in run() - task 12b410aa-8751-f070-39c4-0000000008b5 41175 1727204684.05538: variable 'ansible_search_path' from source: unknown 41175 1727204684.05541: variable 'ansible_search_path' from source: unknown 41175 1727204684.05561: calling self._execute() 41175 1727204684.05663: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204684.05671: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204684.05800: variable 'omit' from source: magic vars 41175 1727204684.06115: variable 'ansible_distribution_major_version' from source: facts 41175 1727204684.06136: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204684.06149: _execute() done 41175 1727204684.06159: dumping result to json 41175 1727204684.06169: done dumping result, returning 41175 1727204684.06180: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_interface_stat.yml' [12b410aa-8751-f070-39c4-0000000008b5] 41175 1727204684.06196: sending task result for task 12b410aa-8751-f070-39c4-0000000008b5 41175 1727204684.06336: no more pending results, returning what we have 41175 1727204684.06342: in VariableManager get_vars() 41175 1727204684.06423: Calling all_inventory to load vars for managed-node3 41175 1727204684.06427: Calling groups_inventory to load vars for managed-node3 41175 1727204684.06432: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204684.06452: Calling all_plugins_play to load vars for managed-node3 41175 1727204684.06457: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204684.06462: Calling groups_plugins_play to load vars for managed-node3 41175 1727204684.06987: done sending task result for task 12b410aa-8751-f070-39c4-0000000008b5 41175 1727204684.06993: WORKER PROCESS EXITING 41175 
1727204684.08218: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204684.10153: done with get_vars() 41175 1727204684.10184: variable 'ansible_search_path' from source: unknown 41175 1727204684.10185: variable 'ansible_search_path' from source: unknown 41175 1727204684.10232: we have included files to process 41175 1727204684.10234: generating all_blocks data 41175 1727204684.10236: done generating all_blocks data 41175 1727204684.10237: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 41175 1727204684.10238: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 41175 1727204684.10241: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 41175 1727204684.10466: done processing included file 41175 1727204684.10468: iterating over new_blocks loaded from include file 41175 1727204684.10470: in VariableManager get_vars() 41175 1727204684.10485: done with get_vars() 41175 1727204684.10487: filtering new block on tags 41175 1727204684.10508: done filtering new block on tags 41175 1727204684.10510: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node3 41175 1727204684.10515: extending task lists for all hosts with included blocks 41175 1727204684.10597: done extending task lists 41175 1727204684.10598: done processing included files 41175 1727204684.10598: results queue empty 41175 1727204684.10599: checking for any_errors_fatal 41175 1727204684.10601: done checking for any_errors_fatal 41175 1727204684.10602: checking for max_fail_percentage 41175 1727204684.10603: done checking for 
max_fail_percentage 41175 1727204684.10604: checking to see if all hosts have failed and the running result is not ok 41175 1727204684.10604: done checking to see if all hosts have failed 41175 1727204684.10605: getting the remaining hosts for this loop 41175 1727204684.10606: done getting the remaining hosts for this loop 41175 1727204684.10608: getting the next task for host managed-node3 41175 1727204684.10611: done getting next task for host managed-node3 41175 1727204684.10612: ^ task is: TASK: Get stat for interface {{ interface }} 41175 1727204684.10615: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204684.10617: getting variables 41175 1727204684.10618: in VariableManager get_vars() 41175 1727204684.10625: Calling all_inventory to load vars for managed-node3 41175 1727204684.10627: Calling groups_inventory to load vars for managed-node3 41175 1727204684.10628: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204684.10632: Calling all_plugins_play to load vars for managed-node3 41175 1727204684.10634: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204684.10637: Calling groups_plugins_play to load vars for managed-node3 41175 1727204684.11798: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204684.13385: done with get_vars() 41175 1727204684.13408: done getting variables 41175 1727204684.13537: variable 'interface' from source: set_fact TASK [Get stat for interface ethtest0] ***************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 15:04:44 -0400 (0:00:00.089) 0:00:51.274 ***** 41175 1727204684.13562: entering _queue_task() for managed-node3/stat 41175 1727204684.13816: worker is 1 (out of 1 available) 41175 1727204684.13830: exiting _queue_task() for managed-node3/stat 41175 1727204684.13841: done queuing things up, now waiting for results queue to drain 41175 1727204684.13843: waiting for pending results... 
41175 1727204684.14041: running TaskExecutor() for managed-node3/TASK: Get stat for interface ethtest0 41175 1727204684.14149: in run() - task 12b410aa-8751-f070-39c4-0000000008cf 41175 1727204684.14162: variable 'ansible_search_path' from source: unknown 41175 1727204684.14166: variable 'ansible_search_path' from source: unknown 41175 1727204684.14201: calling self._execute() 41175 1727204684.14278: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204684.14286: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204684.14301: variable 'omit' from source: magic vars 41175 1727204684.14612: variable 'ansible_distribution_major_version' from source: facts 41175 1727204684.14631: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204684.14634: variable 'omit' from source: magic vars 41175 1727204684.14673: variable 'omit' from source: magic vars 41175 1727204684.14760: variable 'interface' from source: set_fact 41175 1727204684.14776: variable 'omit' from source: magic vars 41175 1727204684.14821: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41175 1727204684.14855: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41175 1727204684.14872: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41175 1727204684.14892: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204684.14903: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204684.14934: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41175 1727204684.14937: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204684.14941: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204684.15029: Set connection var ansible_shell_executable to /bin/sh 41175 1727204684.15034: Set connection var ansible_shell_type to sh 41175 1727204684.15039: Set connection var ansible_pipelining to False 41175 1727204684.15048: Set connection var ansible_timeout to 10 41175 1727204684.15054: Set connection var ansible_connection to ssh 41175 1727204684.15061: Set connection var ansible_module_compression to ZIP_DEFLATED 41175 1727204684.15087: variable 'ansible_shell_executable' from source: unknown 41175 1727204684.15091: variable 'ansible_connection' from source: unknown 41175 1727204684.15099: variable 'ansible_module_compression' from source: unknown 41175 1727204684.15102: variable 'ansible_shell_type' from source: unknown 41175 1727204684.15105: variable 'ansible_shell_executable' from source: unknown 41175 1727204684.15110: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204684.15115: variable 'ansible_pipelining' from source: unknown 41175 1727204684.15120: variable 'ansible_timeout' from source: unknown 41175 1727204684.15125: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204684.15300: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 41175 1727204684.15312: variable 'omit' from source: magic vars 41175 1727204684.15321: starting attempt loop 41175 1727204684.15324: running the handler 41175 1727204684.15337: _low_level_execute_command(): starting 41175 1727204684.15345: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41175 1727204684.15884: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204684.15892: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204684.15895: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204684.15959: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204684.15963: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204684.15965: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204684.16004: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204684.17752: stdout chunk (state=3): >>>/root <<< 41175 1727204684.17860: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204684.17914: stderr chunk (state=3): >>><<< 41175 1727204684.17918: stdout chunk (state=3): >>><<< 41175 1727204684.17943: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204684.17956: _low_level_execute_command(): starting 41175 1727204684.17963: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204684.1794312-43261-241324512566021 `" && echo ansible-tmp-1727204684.1794312-43261-241324512566021="` echo /root/.ansible/tmp/ansible-tmp-1727204684.1794312-43261-241324512566021 `" ) && sleep 0' 41175 1727204684.18420: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204684.18424: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 41175 1727204684.18429: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 41175 1727204684.18439: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 41175 1727204684.18442: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204684.18485: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204684.18492: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204684.18528: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204684.20513: stdout chunk (state=3): >>>ansible-tmp-1727204684.1794312-43261-241324512566021=/root/.ansible/tmp/ansible-tmp-1727204684.1794312-43261-241324512566021 <<< 41175 1727204684.20631: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204684.20676: stderr chunk (state=3): >>><<< 41175 1727204684.20680: stdout chunk (state=3): >>><<< 41175 1727204684.20698: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204684.1794312-43261-241324512566021=/root/.ansible/tmp/ansible-tmp-1727204684.1794312-43261-241324512566021 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204684.20742: variable 'ansible_module_compression' from source: unknown 41175 1727204684.20786: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-41175qa2aqh90/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 41175 1727204684.20825: variable 'ansible_facts' from source: unknown 41175 1727204684.20877: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204684.1794312-43261-241324512566021/AnsiballZ_stat.py 41175 1727204684.20986: Sending initial data 41175 1727204684.20993: Sent initial data (153 bytes) 41175 1727204684.21457: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204684.21460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204684.21463: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address <<< 41175 1727204684.21467: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 41175 1727204684.21470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204684.21520: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204684.21524: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204684.21564: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204684.23160: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 41175 1727204684.23164: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41175 1727204684.23194: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41175 1727204684.23231: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41175qa2aqh90/tmpolwlzuli /root/.ansible/tmp/ansible-tmp-1727204684.1794312-43261-241324512566021/AnsiballZ_stat.py <<< 41175 1727204684.23237: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204684.1794312-43261-241324512566021/AnsiballZ_stat.py" <<< 41175 1727204684.23264: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-41175qa2aqh90/tmpolwlzuli" to remote "/root/.ansible/tmp/ansible-tmp-1727204684.1794312-43261-241324512566021/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204684.1794312-43261-241324512566021/AnsiballZ_stat.py" <<< 41175 1727204684.24030: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204684.24092: stderr chunk (state=3): >>><<< 41175 1727204684.24095: stdout chunk (state=3): >>><<< 41175 1727204684.24115: done transferring module to remote 41175 1727204684.24128: _low_level_execute_command(): starting 41175 1727204684.24134: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204684.1794312-43261-241324512566021/ /root/.ansible/tmp/ansible-tmp-1727204684.1794312-43261-241324512566021/AnsiballZ_stat.py && sleep 0' 41175 1727204684.24582: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204684.24585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 41175 1727204684.24588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address <<< 41175 1727204684.24593: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 41175 1727204684.24599: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204684.24649: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204684.24654: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204684.24687: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204684.26510: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204684.26560: stderr chunk (state=3): >>><<< 41175 1727204684.26564: stdout chunk (state=3): >>><<< 41175 1727204684.26580: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204684.26585: _low_level_execute_command(): starting 41175 1727204684.26591: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204684.1794312-43261-241324512566021/AnsiballZ_stat.py && sleep 0' 41175 1727204684.27060: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 1727204684.27063: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41175 1727204684.27066: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204684.27068: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204684.27071: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204684.27125: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204684.27134: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 <<< 41175 1727204684.27173: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204684.44435: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 41175 1727204684.45846: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 41175 1727204684.45908: stderr chunk (state=3): >>><<< 41175 1727204684.45912: stdout chunk (state=3): >>><<< 41175 1727204684.45930: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 41175 1727204684.45965: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204684.1794312-43261-241324512566021/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41175 1727204684.45978: _low_level_execute_command(): starting 41175 1727204684.45984: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204684.1794312-43261-241324512566021/ > /dev/null 2>&1 && sleep 0' 41175 1727204684.46471: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204684.46475: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 41175 1727204684.46478: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 41175 
1727204684.46480: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 41175 1727204684.46482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204684.46543: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204684.46550: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204684.46581: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204684.48522: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204684.48567: stderr chunk (state=3): >>><<< 41175 1727204684.48572: stdout chunk (state=3): >>><<< 41175 1727204684.48587: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204684.48597: handler run complete 41175 1727204684.48618: attempt loop complete, returning result 41175 1727204684.48624: _execute() done 41175 1727204684.48627: dumping result to json 41175 1727204684.48632: done dumping result, returning 41175 1727204684.48640: done running TaskExecutor() for managed-node3/TASK: Get stat for interface ethtest0 [12b410aa-8751-f070-39c4-0000000008cf] 41175 1727204684.48646: sending task result for task 12b410aa-8751-f070-39c4-0000000008cf 41175 1727204684.48747: done sending task result for task 12b410aa-8751-f070-39c4-0000000008cf 41175 1727204684.48751: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "stat": { "exists": false } } 41175 1727204684.48843: no more pending results, returning what we have 41175 1727204684.48848: results queue empty 41175 1727204684.48849: checking for any_errors_fatal 41175 1727204684.48851: done checking for any_errors_fatal 41175 1727204684.48851: checking for max_fail_percentage 41175 1727204684.48853: done checking for max_fail_percentage 41175 1727204684.48855: checking to see if all hosts have failed and the running result is not ok 41175 1727204684.48856: done checking to see if all hosts have failed 41175 1727204684.48857: getting the remaining hosts for this loop 41175 1727204684.48859: done getting the remaining hosts for this loop 41175 1727204684.48866: getting the next task for host managed-node3 41175 1727204684.48875: done getting next task for host managed-node3 41175 1727204684.48878: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 41175 1727204684.48881: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41175 1727204684.48886: getting variables 41175 1727204684.48888: in VariableManager get_vars() 41175 1727204684.48920: Calling all_inventory to load vars for managed-node3 41175 1727204684.48924: Calling groups_inventory to load vars for managed-node3 41175 1727204684.48928: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204684.48940: Calling all_plugins_play to load vars for managed-node3 41175 1727204684.48943: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204684.48946: Calling groups_plugins_play to load vars for managed-node3 41175 1727204684.50243: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204684.51882: done with get_vars() 41175 1727204684.51907: done getting variables 41175 1727204684.51964: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 41175 1727204684.52066: variable 'interface' from source: set_fact TASK [Assert that the interface is absent - 'ethtest0'] ************************ task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Tuesday 24 September 2024 15:04:44 -0400 (0:00:00.385) 0:00:51.659 ***** 41175 1727204684.52093: 
entering _queue_task() for managed-node3/assert 41175 1727204684.52344: worker is 1 (out of 1 available) 41175 1727204684.52360: exiting _queue_task() for managed-node3/assert 41175 1727204684.52371: done queuing things up, now waiting for results queue to drain 41175 1727204684.52373: waiting for pending results... 41175 1727204684.52563: running TaskExecutor() for managed-node3/TASK: Assert that the interface is absent - 'ethtest0' 41175 1727204684.52643: in run() - task 12b410aa-8751-f070-39c4-0000000008b6 41175 1727204684.52656: variable 'ansible_search_path' from source: unknown 41175 1727204684.52660: variable 'ansible_search_path' from source: unknown 41175 1727204684.52693: calling self._execute() 41175 1727204684.52782: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204684.52790: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204684.52800: variable 'omit' from source: magic vars 41175 1727204684.53121: variable 'ansible_distribution_major_version' from source: facts 41175 1727204684.53131: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204684.53138: variable 'omit' from source: magic vars 41175 1727204684.53172: variable 'omit' from source: magic vars 41175 1727204684.53257: variable 'interface' from source: set_fact 41175 1727204684.53276: variable 'omit' from source: magic vars 41175 1727204684.53315: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41175 1727204684.53347: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41175 1727204684.53367: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41175 1727204684.53386: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204684.53399: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204684.53428: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41175 1727204684.53432: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204684.53436: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204684.53526: Set connection var ansible_shell_executable to /bin/sh 41175 1727204684.53530: Set connection var ansible_shell_type to sh 41175 1727204684.53536: Set connection var ansible_pipelining to False 41175 1727204684.53545: Set connection var ansible_timeout to 10 41175 1727204684.53552: Set connection var ansible_connection to ssh 41175 1727204684.53558: Set connection var ansible_module_compression to ZIP_DEFLATED 41175 1727204684.53579: variable 'ansible_shell_executable' from source: unknown 41175 1727204684.53583: variable 'ansible_connection' from source: unknown 41175 1727204684.53585: variable 'ansible_module_compression' from source: unknown 41175 1727204684.53590: variable 'ansible_shell_type' from source: unknown 41175 1727204684.53600: variable 'ansible_shell_executable' from source: unknown 41175 1727204684.53603: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204684.53606: variable 'ansible_pipelining' from source: unknown 41175 1727204684.53609: variable 'ansible_timeout' from source: unknown 41175 1727204684.53611: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204684.53736: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41175 1727204684.53748: variable 'omit' from source: magic vars 41175 1727204684.53754: starting 
attempt loop 41175 1727204684.53757: running the handler 41175 1727204684.53884: variable 'interface_stat' from source: set_fact 41175 1727204684.53895: Evaluated conditional (not interface_stat.stat.exists): True 41175 1727204684.53905: handler run complete 41175 1727204684.53923: attempt loop complete, returning result 41175 1727204684.53926: _execute() done 41175 1727204684.53929: dumping result to json 41175 1727204684.53931: done dumping result, returning 41175 1727204684.53941: done running TaskExecutor() for managed-node3/TASK: Assert that the interface is absent - 'ethtest0' [12b410aa-8751-f070-39c4-0000000008b6] 41175 1727204684.53944: sending task result for task 12b410aa-8751-f070-39c4-0000000008b6 41175 1727204684.54039: done sending task result for task 12b410aa-8751-f070-39c4-0000000008b6 41175 1727204684.54044: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false } MSG: All assertions passed 41175 1727204684.54096: no more pending results, returning what we have 41175 1727204684.54101: results queue empty 41175 1727204684.54102: checking for any_errors_fatal 41175 1727204684.54111: done checking for any_errors_fatal 41175 1727204684.54112: checking for max_fail_percentage 41175 1727204684.54114: done checking for max_fail_percentage 41175 1727204684.54115: checking to see if all hosts have failed and the running result is not ok 41175 1727204684.54119: done checking to see if all hosts have failed 41175 1727204684.54120: getting the remaining hosts for this loop 41175 1727204684.54121: done getting the remaining hosts for this loop 41175 1727204684.54126: getting the next task for host managed-node3 41175 1727204684.54134: done getting next task for host managed-node3 41175 1727204684.54138: ^ task is: TASK: Verify network state restored to default 41175 1727204684.54140: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, 
tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41175 1727204684.54144: getting variables 41175 1727204684.54145: in VariableManager get_vars() 41175 1727204684.54174: Calling all_inventory to load vars for managed-node3 41175 1727204684.54177: Calling groups_inventory to load vars for managed-node3 41175 1727204684.54181: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204684.54201: Calling all_plugins_play to load vars for managed-node3 41175 1727204684.54205: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204684.54209: Calling groups_plugins_play to load vars for managed-node3 41175 1727204684.55646: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204684.57263: done with get_vars() 41175 1727204684.57287: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:158 Tuesday 24 September 2024 15:04:44 -0400 (0:00:00.052) 0:00:51.712 ***** 41175 1727204684.57370: entering _queue_task() for managed-node3/include_tasks 41175 1727204684.57639: worker is 1 (out of 1 available) 41175 1727204684.57654: exiting _queue_task() for managed-node3/include_tasks 41175 1727204684.57665: done queuing things up, now waiting for results queue to drain 41175 1727204684.57667: waiting for pending results... 
41175 1727204684.57857: running TaskExecutor() for managed-node3/TASK: Verify network state restored to default 41175 1727204684.57935: in run() - task 12b410aa-8751-f070-39c4-0000000000f1 41175 1727204684.57951: variable 'ansible_search_path' from source: unknown 41175 1727204684.57983: calling self._execute() 41175 1727204684.58069: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204684.58076: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204684.58087: variable 'omit' from source: magic vars 41175 1727204684.58406: variable 'ansible_distribution_major_version' from source: facts 41175 1727204684.58420: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204684.58424: _execute() done 41175 1727204684.58428: dumping result to json 41175 1727204684.58431: done dumping result, returning 41175 1727204684.58440: done running TaskExecutor() for managed-node3/TASK: Verify network state restored to default [12b410aa-8751-f070-39c4-0000000000f1] 41175 1727204684.58451: sending task result for task 12b410aa-8751-f070-39c4-0000000000f1 41175 1727204684.58546: done sending task result for task 12b410aa-8751-f070-39c4-0000000000f1 41175 1727204684.58551: WORKER PROCESS EXITING 41175 1727204684.58581: no more pending results, returning what we have 41175 1727204684.58587: in VariableManager get_vars() 41175 1727204684.58627: Calling all_inventory to load vars for managed-node3 41175 1727204684.58630: Calling groups_inventory to load vars for managed-node3 41175 1727204684.58635: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204684.58647: Calling all_plugins_play to load vars for managed-node3 41175 1727204684.58650: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204684.58654: Calling groups_plugins_play to load vars for managed-node3 41175 1727204684.60019: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204684.61633: done with get_vars() 41175 1727204684.61652: variable 'ansible_search_path' from source: unknown 41175 1727204684.61663: we have included files to process 41175 1727204684.61664: generating all_blocks data 41175 1727204684.61665: done generating all_blocks data 41175 1727204684.61668: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 41175 1727204684.61669: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 41175 1727204684.61671: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 41175 1727204684.61996: done processing included file 41175 1727204684.61998: iterating over new_blocks loaded from include file 41175 1727204684.61999: in VariableManager get_vars() 41175 1727204684.62009: done with get_vars() 41175 1727204684.62011: filtering new block on tags 41175 1727204684.62028: done filtering new block on tags 41175 1727204684.62029: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed-node3 41175 1727204684.62033: extending task lists for all hosts with included blocks 41175 1727204684.62232: done extending task lists 41175 1727204684.62233: done processing included files 41175 1727204684.62234: results queue empty 41175 1727204684.62235: checking for any_errors_fatal 41175 1727204684.62237: done checking for any_errors_fatal 41175 1727204684.62237: checking for max_fail_percentage 41175 1727204684.62238: done checking for max_fail_percentage 41175 1727204684.62239: checking to see if all hosts have failed and the running 
result is not ok 41175 1727204684.62240: done checking to see if all hosts have failed 41175 1727204684.62240: getting the remaining hosts for this loop 41175 1727204684.62241: done getting the remaining hosts for this loop 41175 1727204684.62243: getting the next task for host managed-node3 41175 1727204684.62246: done getting next task for host managed-node3 41175 1727204684.62248: ^ task is: TASK: Check routes and DNS 41175 1727204684.62249: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204684.62251: getting variables 41175 1727204684.62252: in VariableManager get_vars() 41175 1727204684.62258: Calling all_inventory to load vars for managed-node3 41175 1727204684.62260: Calling groups_inventory to load vars for managed-node3 41175 1727204684.62262: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204684.62266: Calling all_plugins_play to load vars for managed-node3 41175 1727204684.62268: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204684.62270: Calling groups_plugins_play to load vars for managed-node3 41175 1727204684.63390: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204684.64986: done with get_vars() 41175 1727204684.65011: done getting variables 41175 1727204684.65051: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Tuesday 24 September 2024 15:04:44 -0400 (0:00:00.077) 0:00:51.789 ***** 41175 1727204684.65075: entering _queue_task() for managed-node3/shell 41175 1727204684.65354: worker is 1 (out of 1 available) 41175 1727204684.65368: exiting _queue_task() for managed-node3/shell 41175 1727204684.65380: done queuing things up, now waiting for results queue to drain 41175 1727204684.65382: waiting for pending results... 
41175 1727204684.65573: running TaskExecutor() for managed-node3/TASK: Check routes and DNS 41175 1727204684.65665: in run() - task 12b410aa-8751-f070-39c4-0000000008e7 41175 1727204684.65678: variable 'ansible_search_path' from source: unknown 41175 1727204684.65683: variable 'ansible_search_path' from source: unknown 41175 1727204684.65723: calling self._execute() 41175 1727204684.65804: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204684.65811: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204684.65825: variable 'omit' from source: magic vars 41175 1727204684.66148: variable 'ansible_distribution_major_version' from source: facts 41175 1727204684.66160: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204684.66170: variable 'omit' from source: magic vars 41175 1727204684.66207: variable 'omit' from source: magic vars 41175 1727204684.66237: variable 'omit' from source: magic vars 41175 1727204684.66274: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41175 1727204684.66315: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41175 1727204684.66337: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41175 1727204684.66355: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204684.66367: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41175 1727204684.66403: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41175 1727204684.66407: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204684.66410: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204684.66497: 
Set connection var ansible_shell_executable to /bin/sh 41175 1727204684.66502: Set connection var ansible_shell_type to sh 41175 1727204684.66509: Set connection var ansible_pipelining to False 41175 1727204684.66518: Set connection var ansible_timeout to 10 41175 1727204684.66527: Set connection var ansible_connection to ssh 41175 1727204684.66533: Set connection var ansible_module_compression to ZIP_DEFLATED 41175 1727204684.66552: variable 'ansible_shell_executable' from source: unknown 41175 1727204684.66555: variable 'ansible_connection' from source: unknown 41175 1727204684.66558: variable 'ansible_module_compression' from source: unknown 41175 1727204684.66563: variable 'ansible_shell_type' from source: unknown 41175 1727204684.66566: variable 'ansible_shell_executable' from source: unknown 41175 1727204684.66570: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204684.66575: variable 'ansible_pipelining' from source: unknown 41175 1727204684.66578: variable 'ansible_timeout' from source: unknown 41175 1727204684.66584: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204684.66708: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41175 1727204684.66718: variable 'omit' from source: magic vars 41175 1727204684.66729: starting attempt loop 41175 1727204684.66732: running the handler 41175 1727204684.66743: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41175 1727204684.66761: 
_low_level_execute_command(): starting 41175 1727204684.66770: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41175 1727204684.67327: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204684.67331: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204684.67334: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204684.67336: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204684.67393: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204684.67397: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204684.67402: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204684.67447: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204684.69199: stdout chunk (state=3): >>>/root <<< 41175 1727204684.69307: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204684.69366: stderr chunk (state=3): >>><<< 41175 1727204684.69369: stdout chunk (state=3): >>><<< 41175 1727204684.69392: _low_level_execute_command() done: rc=0, 
stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204684.69404: _low_level_execute_command(): starting 41175 1727204684.69411: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204684.6939223-43278-260006091664724 `" && echo ansible-tmp-1727204684.6939223-43278-260006091664724="` echo /root/.ansible/tmp/ansible-tmp-1727204684.6939223-43278-260006091664724 `" ) && sleep 0' 41175 1727204684.69886: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204684.69889: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 
10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204684.69901: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 41175 1727204684.69904: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 41175 1727204684.69907: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204684.69953: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204684.69957: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204684.70002: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204684.71976: stdout chunk (state=3): >>>ansible-tmp-1727204684.6939223-43278-260006091664724=/root/.ansible/tmp/ansible-tmp-1727204684.6939223-43278-260006091664724 <<< 41175 1727204684.72102: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204684.72149: stderr chunk (state=3): >>><<< 41175 1727204684.72153: stdout chunk (state=3): >>><<< 41175 1727204684.72167: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204684.6939223-43278-260006091664724=/root/.ansible/tmp/ansible-tmp-1727204684.6939223-43278-260006091664724 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204684.72200: variable 'ansible_module_compression' from source: unknown 41175 1727204684.72244: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-41175qa2aqh90/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 41175 1727204684.72281: variable 'ansible_facts' from source: unknown 41175 1727204684.72339: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204684.6939223-43278-260006091664724/AnsiballZ_command.py 41175 1727204684.72453: Sending initial data 41175 1727204684.72457: Sent initial data (156 bytes) 41175 1727204684.72921: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204684.72924: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 41175 1727204684.72927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 41175 1727204684.72929: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 41175 1727204684.72932: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204684.72984: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204684.72990: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204684.73024: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204684.74620: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41175 1727204684.74651: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41175 1727204684.74687: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41175qa2aqh90/tmpcd1tklij /root/.ansible/tmp/ansible-tmp-1727204684.6939223-43278-260006091664724/AnsiballZ_command.py <<< 41175 1727204684.74701: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204684.6939223-43278-260006091664724/AnsiballZ_command.py" <<< 41175 1727204684.74721: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-41175qa2aqh90/tmpcd1tklij" to remote "/root/.ansible/tmp/ansible-tmp-1727204684.6939223-43278-260006091664724/AnsiballZ_command.py" <<< 41175 1727204684.74727: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204684.6939223-43278-260006091664724/AnsiballZ_command.py" <<< 41175 1727204684.75588: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204684.75652: stderr chunk (state=3): >>><<< 41175 1727204684.75658: stdout chunk (state=3): >>><<< 41175 1727204684.75676: done transferring module to remote 41175 1727204684.75687: _low_level_execute_command(): starting 41175 1727204684.75694: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204684.6939223-43278-260006091664724/ /root/.ansible/tmp/ansible-tmp-1727204684.6939223-43278-260006091664724/AnsiballZ_command.py && sleep 0' 41175 1727204684.76144: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204684.76147: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 
1727204684.76150: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204684.76152: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204684.76201: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204684.76222: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204684.76253: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204684.82288: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204684.82368: stderr chunk (state=3): >>><<< 41175 1727204684.82372: stdout chunk (state=3): >>><<< 41175 1727204684.82476: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204684.82480: _low_level_execute_command(): starting 41175 1727204684.82483: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204684.6939223-43278-260006091664724/AnsiballZ_command.py && sleep 0' 41175 1727204684.83071: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 1727204684.83086: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204684.83110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204684.83145: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 41175 1727204684.83253: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 41175 1727204684.83270: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204684.83291: stderr chunk 
(state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204684.83371: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204685.01532: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:5e:c8:16:36:1d brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.10.90/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 2441sec preferred_lft 2441sec\n inet6 fe80::37d3:4e93:30d:de94/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.10.90 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.10.90 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8).\n# Do not edit.\n#\n# This file might be symlinked as /etc/resolv.conf. If you're looking at\n# /etc/resolv.conf and seeing this text, you have followed the symlink.\n#\n# This is a dynamic resolv.conf file for connecting local clients to the\n# internal DNS stub resolver of systemd-resolved. This file lists all\n# configured search domains.\n#\n# Run \"resolvectl status\" to see details about the uplink DNS servers\n# currently in use.\n#\n# Third party programs should typically not access this file directly, but only\n# through the symlink at /etc/resolv.conf. 
To manage man:resolv.conf(5) in a\n# different way, replace this symlink by a static file or a different symlink.\n#\n# See man:systemd-resolved.service(8) for details about the supported modes of\n# operation for /etc/resolv.conf.\n\nnameserver 127.0.0.53\noptions edns0 trust-ad\nsearch us-east-1.aws.redhat.com", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 15:04:45.004189", "end": "2024-09-24 15:04:45.013086", "delta": "0:00:00.008897", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 41175 1727204685.03115: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204685.03223: stderr chunk (state=3): >>>Shared connection to 10.31.10.90 closed. 
<<< 41175 1727204685.03236: stdout chunk (state=3): >>><<< 41175 1727204685.03250: stderr chunk (state=3): >>><<< 41175 1727204685.03396: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:5e:c8:16:36:1d brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.10.90/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 2441sec preferred_lft 2441sec\n inet6 fe80::37d3:4e93:30d:de94/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.10.90 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.10.90 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8).\n# Do not edit.\n#\n# This file might be symlinked as /etc/resolv.conf. If you're looking at\n# /etc/resolv.conf and seeing this text, you have followed the symlink.\n#\n# This is a dynamic resolv.conf file for connecting local clients to the\n# internal DNS stub resolver of systemd-resolved. This file lists all\n# configured search domains.\n#\n# Run \"resolvectl status\" to see details about the uplink DNS servers\n# currently in use.\n#\n# Third party programs should typically not access this file directly, but only\n# through the symlink at /etc/resolv.conf. 
To manage man:resolv.conf(5) in a\n# different way, replace this symlink by a static file or a different symlink.\n#\n# See man:systemd-resolved.service(8) for details about the supported modes of\n# operation for /etc/resolv.conf.\n\nnameserver 127.0.0.53\noptions edns0 trust-ad\nsearch us-east-1.aws.redhat.com", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 15:04:45.004189", "end": "2024-09-24 15:04:45.013086", "delta": "0:00:00.008897", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 41175 1727204685.03407: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204684.6939223-43278-260006091664724/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41175 1727204685.03410: _low_level_execute_command(): starting 41175 1727204685.03412: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204684.6939223-43278-260006091664724/ > /dev/null 2>&1 && sleep 0' 41175 1727204685.04052: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 41175 1727204685.04075: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41175 1727204685.04093: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41175 1727204685.04199: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41175 1727204685.04224: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41175 1727204685.04242: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41175 1727204685.04264: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41175 1727204685.04328: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41175 1727204685.06364: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41175 1727204685.06368: stdout chunk (state=3): >>><<< 41175 1727204685.06376: stderr chunk (state=3): >>><<< 41175 1727204685.06408: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41175 1727204685.06426: handler run complete 41175 1727204685.06505: Evaluated conditional (False): False 41175 1727204685.06509: attempt loop complete, returning result 41175 1727204685.06511: _execute() done 41175 1727204685.06513: dumping result to json 41175 1727204685.06516: done dumping result, returning 41175 1727204685.06594: done running TaskExecutor() for managed-node3/TASK: Check routes and DNS [12b410aa-8751-f070-39c4-0000000008e7] 41175 1727204685.06597: sending task result for task 12b410aa-8751-f070-39c4-0000000008e7 ok: [managed-node3] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.008897", "end": "2024-09-24 15:04:45.013086", "rc": 0, "start": "2024-09-24 15:04:45.004189" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 12:5e:c8:16:36:1d brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.10.90/22 brd 10.31.11.255 scope 
global dynamic noprefixroute eth0 valid_lft 2441sec preferred_lft 2441sec inet6 fe80::37d3:4e93:30d:de94/64 scope link noprefixroute valid_lft forever preferred_lft forever IP ROUTE default via 10.31.8.1 dev eth0 proto dhcp src 10.31.10.90 metric 100 10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.10.90 metric 100 IP -6 ROUTE fe80::/64 dev eth0 proto kernel metric 1024 pref medium RESOLV # This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8). # Do not edit. # # This file might be symlinked as /etc/resolv.conf. If you're looking at # /etc/resolv.conf and seeing this text, you have followed the symlink. # # This is a dynamic resolv.conf file for connecting local clients to the # internal DNS stub resolver of systemd-resolved. This file lists all # configured search domains. # # Run "resolvectl status" to see details about the uplink DNS servers # currently in use. # # Third party programs should typically not access this file directly, but only # through the symlink at /etc/resolv.conf. To manage man:resolv.conf(5) in a # different way, replace this symlink by a static file or a different symlink. # # See man:systemd-resolved.service(8) for details about the supported modes of # operation for /etc/resolv.conf. 
nameserver 127.0.0.53 options edns0 trust-ad search us-east-1.aws.redhat.com 41175 1727204685.06815: no more pending results, returning what we have 41175 1727204685.06823: results queue empty 41175 1727204685.06825: checking for any_errors_fatal 41175 1727204685.06827: done checking for any_errors_fatal 41175 1727204685.06828: checking for max_fail_percentage 41175 1727204685.06830: done checking for max_fail_percentage 41175 1727204685.06832: checking to see if all hosts have failed and the running result is not ok 41175 1727204685.06833: done checking to see if all hosts have failed 41175 1727204685.06834: getting the remaining hosts for this loop 41175 1727204685.06836: done getting the remaining hosts for this loop 41175 1727204685.06842: getting the next task for host managed-node3 41175 1727204685.06850: done getting next task for host managed-node3 41175 1727204685.06854: ^ task is: TASK: Verify DNS and network connectivity 41175 1727204685.06858: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41175 1727204685.06867: getting variables 41175 1727204685.06869: in VariableManager get_vars() 41175 1727204685.07119: Calling all_inventory to load vars for managed-node3 41175 1727204685.07123: Calling groups_inventory to load vars for managed-node3 41175 1727204685.07128: Calling all_plugins_inventory to load vars for managed-node3 41175 1727204685.07135: done sending task result for task 12b410aa-8751-f070-39c4-0000000008e7 41175 1727204685.07138: WORKER PROCESS EXITING 41175 1727204685.07151: Calling all_plugins_play to load vars for managed-node3 41175 1727204685.07156: Calling groups_plugins_inventory to load vars for managed-node3 41175 1727204685.07160: Calling groups_plugins_play to load vars for managed-node3 41175 1727204685.10140: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41175 1727204685.13416: done with get_vars() 41175 1727204685.13464: done getting variables 41175 1727204685.13545: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Tuesday 24 September 2024 15:04:45 -0400 (0:00:00.485) 0:00:52.274 ***** 41175 1727204685.13594: entering _queue_task() for managed-node3/shell 41175 1727204685.14139: worker is 1 (out of 1 available) 41175 1727204685.14153: exiting _queue_task() for managed-node3/shell 41175 1727204685.14163: done queuing things up, now waiting for results queue to drain 41175 1727204685.14165: waiting for pending results... 
41175 1727204685.14381: running TaskExecutor() for managed-node3/TASK: Verify DNS and network connectivity 41175 1727204685.14513: in run() - task 12b410aa-8751-f070-39c4-0000000008e8 41175 1727204685.14537: variable 'ansible_search_path' from source: unknown 41175 1727204685.14546: variable 'ansible_search_path' from source: unknown 41175 1727204685.14604: calling self._execute() 41175 1727204685.14742: variable 'ansible_host' from source: host vars for 'managed-node3' 41175 1727204685.14780: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41175 1727204685.14784: variable 'omit' from source: magic vars 41175 1727204685.15309: variable 'ansible_distribution_major_version' from source: facts 41175 1727204685.15435: Evaluated conditional (ansible_distribution_major_version != '6'): True 41175 1727204685.15547: variable 'ansible_facts' from source: unknown 41175 1727204685.16892: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): False 41175 1727204685.16959: when evaluation is False, skipping this task 41175 1727204685.16965: _execute() done 41175 1727204685.16968: dumping result to json 41175 1727204685.16970: done dumping result, returning 41175 1727204685.16973: done running TaskExecutor() for managed-node3/TASK: Verify DNS and network connectivity [12b410aa-8751-f070-39c4-0000000008e8] 41175 1727204685.16976: sending task result for task 12b410aa-8751-f070-39c4-0000000008e8 41175 1727204685.17149: done sending task result for task 12b410aa-8751-f070-39c4-0000000008e8 41175 1727204685.17153: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_facts[\"distribution\"] == \"CentOS\"", "skip_reason": "Conditional result was False" } 41175 1727204685.17218: no more pending results, returning what we have 41175 1727204685.17224: results queue empty 41175 1727204685.17226: checking for any_errors_fatal 41175 1727204685.17239: done checking for any_errors_fatal 41175 
1727204685.17240: checking for max_fail_percentage
41175 1727204685.17242: done checking for max_fail_percentage
41175 1727204685.17244: checking to see if all hosts have failed and the running result is not ok
41175 1727204685.17245: done checking to see if all hosts have failed
41175 1727204685.17246: getting the remaining hosts for this loop
41175 1727204685.17248: done getting the remaining hosts for this loop
41175 1727204685.17254: getting the next task for host managed-node3
41175 1727204685.17266: done getting next task for host managed-node3
41175 1727204685.17268: ^ task is: TASK: meta (flush_handlers)
41175 1727204685.17271: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
41175 1727204685.17277: getting variables
41175 1727204685.17279: in VariableManager get_vars()
41175 1727204685.17315: Calling all_inventory to load vars for managed-node3
41175 1727204685.17322: Calling groups_inventory to load vars for managed-node3
41175 1727204685.17327: Calling all_plugins_inventory to load vars for managed-node3
41175 1727204685.17345: Calling all_plugins_play to load vars for managed-node3
41175 1727204685.17350: Calling groups_plugins_inventory to load vars for managed-node3
41175 1727204685.17355: Calling groups_plugins_play to load vars for managed-node3
41175 1727204685.20250: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41175 1727204685.23394: done with get_vars()
41175 1727204685.23446: done getting variables
41175 1727204685.23540: in VariableManager get_vars()
41175 1727204685.23555: Calling all_inventory to load vars for managed-node3
41175 1727204685.23558: Calling groups_inventory to load vars for managed-node3
41175 1727204685.23562: Calling all_plugins_inventory to load vars for managed-node3
41175 1727204685.23568: Calling all_plugins_play to load vars for managed-node3
41175 1727204685.23571: Calling groups_plugins_inventory to load vars for managed-node3
41175 1727204685.23575: Calling groups_plugins_play to load vars for managed-node3
41175 1727204685.25761: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41175 1727204685.28933: done with get_vars()
41175 1727204685.28988: done queuing things up, now waiting for results queue to drain
41175 1727204685.28994: results queue empty
41175 1727204685.28995: checking for any_errors_fatal
41175 1727204685.28999: done checking for any_errors_fatal
41175 1727204685.29000: checking for max_fail_percentage
41175 1727204685.29002: done checking for max_fail_percentage
41175 1727204685.29003: checking to see if all hosts have failed and the running result is not ok
41175 1727204685.29004: done checking to see if all hosts have failed
41175 1727204685.29005: getting the remaining hosts for this loop
41175 1727204685.29006: done getting the remaining hosts for this loop
41175 1727204685.29011: getting the next task for host managed-node3
41175 1727204685.29019: done getting next task for host managed-node3
41175 1727204685.29022: ^ task is: TASK: meta (flush_handlers)
41175 1727204685.29024: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
41175 1727204685.29027: getting variables
41175 1727204685.29029: in VariableManager get_vars()
41175 1727204685.29041: Calling all_inventory to load vars for managed-node3
41175 1727204685.29044: Calling groups_inventory to load vars for managed-node3
41175 1727204685.29048: Calling all_plugins_inventory to load vars for managed-node3
41175 1727204685.29055: Calling all_plugins_play to load vars for managed-node3
41175 1727204685.29058: Calling groups_plugins_inventory to load vars for managed-node3
41175 1727204685.29062: Calling groups_plugins_play to load vars for managed-node3
41175 1727204685.31348: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41175 1727204685.35284: done with get_vars()
41175 1727204685.35428: done getting variables
41175 1727204685.35652: in VariableManager get_vars()
41175 1727204685.35665: Calling all_inventory to load vars for managed-node3
41175 1727204685.35669: Calling groups_inventory to load vars for managed-node3
41175 1727204685.35672: Calling all_plugins_inventory to load vars for managed-node3
41175 1727204685.35678: Calling all_plugins_play to load vars for managed-node3
41175 1727204685.35681: Calling groups_plugins_inventory to load vars for managed-node3
41175 1727204685.35685: Calling groups_plugins_play to load vars for managed-node3
41175 1727204685.44878: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41175 1727204685.47957: done with get_vars()
41175 1727204685.48006: done queuing things up, now waiting for results queue to drain
41175 1727204685.48009: results queue empty
41175 1727204685.48010: checking for any_errors_fatal
41175 1727204685.48012: done checking for any_errors_fatal
41175 1727204685.48013: checking for max_fail_percentage
41175 1727204685.48014: done checking for max_fail_percentage
41175 1727204685.48015: checking to see if all hosts have failed and the running result is not ok
41175 1727204685.48019: done checking to see if all hosts have failed
41175 1727204685.48020: getting the remaining hosts for this loop
41175 1727204685.48021: done getting the remaining hosts for this loop
41175 1727204685.48035: getting the next task for host managed-node3
41175 1727204685.48039: done getting next task for host managed-node3
41175 1727204685.48040: ^ task is: None
41175 1727204685.48042: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
41175 1727204685.48043: done queuing things up, now waiting for results queue to drain
41175 1727204685.48045: results queue empty
41175 1727204685.48046: checking for any_errors_fatal
41175 1727204685.48046: done checking for any_errors_fatal
41175 1727204685.48047: checking for max_fail_percentage
41175 1727204685.48049: done checking for max_fail_percentage
41175 1727204685.48050: checking to see if all hosts have failed and the running result is not ok
41175 1727204685.48051: done checking to see if all hosts have failed
41175 1727204685.48052: getting the next task for host managed-node3
41175 1727204685.48055: done getting next task for host managed-node3
41175 1727204685.48056: ^ task is: None
41175 1727204685.48057: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False

PLAY RECAP *********************************************************************
managed-node3              : ok=89   changed=6    unreachable=0    failed=0    skipped=92   rescued=0    ignored=1

Tuesday 24 September 2024  15:04:45 -0400 (0:00:00.345)       0:00:52.620 *****
===============================================================================
Install iproute --------------------------------------------------------- 2.63s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
fedora.linux_system_roles.network : Check which services are running ---- 2.43s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 2.42s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 2.28s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 2.21s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Gathering Facts --------------------------------------------------------- 1.37s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tests_route_table_nm.yml:6
fedora.linux_system_roles.network : Check which packages are installed --- 1.35s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gathering Facts --------------------------------------------------------- 1.31s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:3
Gathering Facts --------------------------------------------------------- 1.16s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:5
fedora.linux_system_roles.network : Check which packages are installed --- 1.16s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Create veth interface ethtest0 ------------------------------------------ 1.16s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27
fedora.linux_system_roles.network : Check which packages are installed --- 1.16s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gathering Facts --------------------------------------------------------- 1.14s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 1.10s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Gathering Facts --------------------------------------------------------- 1.05s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3
Gathering Facts --------------------------------------------------------- 1.03s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:149
fedora.linux_system_roles.network : Configure networking connection profiles --- 1.02s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Check which packages are installed --- 0.96s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Remove the dedicated test file in `/etc/iproute2/rt_tables.d/` ---------- 0.87s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:135
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.78s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
41175 1727204685.48168: RUNNING CLEANUP